
Research

Research is a systematic process involving the collection, analysis, and interpretation of information to generate new knowledge, verify existing understandings, or develop novel applications, often through hypothesis testing, experimentation, or empirical observation. It spans disciplines from the natural sciences to the social sciences and humanities, employing methods such as quantitative experiments, qualitative case studies, and computational modeling to establish causal relationships and falsifiable claims grounded in evidence. Central to research methodology are elements like research design, which outlines the framework for addressing specific questions; data collection via surveys, observations, or laboratory procedures; and rigorous analysis to ensure validity and reliability. Basic research seeks fundamental truths without immediate practical aims, while applied research targets solvable problems, driving innovations in medicine, engineering, and technology. Its societal value lies in informing evidence-based decisions, fostering technological progress, and addressing challenges like disease and environmental degradation, though outcomes depend on transparent replication and peer scrutiny. A defining characteristic of robust research is its commitment to reproducibility, yet widespread failures in replicating findings, termed the replication crisis, have exposed vulnerabilities, particularly in fields like psychology and biomedicine, where initial results often do not hold under independent verification, underscoring the need for preregistration, larger samples, and incentives aligned with truth over novelty. Historically, modern research methods evolved from empirical traditions in the Scientific Revolution, building on inductive reasoning and controlled experimentation to replace anecdotal or authority-based claims with data-driven inference. Despite institutional pressures favoring publishable results, which can introduce biases toward positive outcomes, high-quality research prioritizes causal mechanisms and empirical falsification to advance human understanding.

Definitions and Etymology

Etymology

The English word "research" entered usage in the mid-16th century, around the 1570s, initially denoting a "close search or inquiry" conducted with thoroughness. It derives directly from the Middle French noun recherche, meaning "a searching" or "act of seeking," which itself stems from the Old French verb recerchier or recercer, implying an intensive or repeated investigation. The term breaks down into the intensive prefix re- (indicating repetition or intensity, akin to "again" or "back") combined with cerchier, meaning "to search" or "to seek," ultimately tracing to the Latin circare, "to go around" or "to wander about in a circle," evoking a sense of circling back for deeper examination. By the 17th century, the term had solidified in English to encompass systematic inquiry, reflecting its connotation of deliberate, iterative pursuit rather than casual looking.

Core Definitions

Research is defined as a systematic investigation, including research development, testing, and evaluation, that is designed to develop or contribute to generalizable knowledge. This definition, originating from U.S. federal regulations such as the Common Rule (45 CFR 46), emphasizes a structured, methodical approach rather than ad hoc exploration, distinguishing research from casual inquiry by requiring a predetermined plan for data collection, analysis, and interpretation to yield findings applicable beyond the immediate context. In academic and scientific contexts, research entails the rigorous collection of empirical or logical evidence to test hypotheses, validate theories, or uncover causal relationships, often involving replicable methods to minimize bias and ensure reliability. Unlike mere inquiry, which may involve open-ended questioning for personal understanding, research demands formal protocols, such as peer review and statistical validation, to produce verifiable results that advance collective knowledge. Key elements include systematicity, referring to a predefined design (e.g., experimental or archival methodology) applied consistently; investigation, encompassing data collection, experimentation, or theoretical modeling; and generalizability, where outcomes must hold potential for broader application, excluding purely internal or operational activities like routine quality assessments. This framework ensures research prioritizes causal realism, identifying true mechanisms over correlative assumptions, while empirical grounding prevents unsubstantiated claims, as seen in fields from physics to the social sciences, where reproducibility remains a benchmark of validity.

Philosophical Foundations

Epistemology, the philosophical study of knowledge, its nature, sources, and limits, underpins research by addressing how investigators justify claims as true. Research paradigms derive from epistemological stances, such as positivism, which posits that knowledge arises from observable, verifiable phenomena through empirical methods, contrasting with interpretivism, which emphasizes subjective meanings derived from human experience. Ontology complements this by examining the nature of reality, whether objective and independent (realism) or socially constructed (constructivism), influencing whether research prioritizes causal mechanisms or interpretive contexts. Ancient foundations trace to Aristotle (384–322 BCE), who integrated empirical observation with logical deduction in works like Physics and Nicomachean Ethics, laying the groundwork for systematic inquiry into natural causes. The Scientific Revolution advanced this through empiricism, championed by Francis Bacon (1561–1626), who in Novum Organum (1620) promoted inductive methods to derive general laws from particular observations, critiquing deductive scholasticism for impeding discovery. Rationalism, articulated by René Descartes (1596–1650) in Meditations on First Philosophy (1641), stressed innate ideas and deductive reasoning from self-evident truths, exemplified by his method of doubt to establish certainty. Modern philosophy of science synthesizes these traditions, with Karl Popper (1902–1994) introducing falsifiability in The Logic of Scientific Discovery (1934) as the demarcation criterion for scientific theories, emphasizing empirical refutation over mere confirmation to advance causal understanding. This falsificationist approach counters inductivism's problem of infinite confirmation, prioritizing rigorous testing against reality. While academia often favors paradigms like Kuhn's paradigm shifts (1962), which highlight social influences on theory change, the predictive success of science supports realism's focus on mind-independent structures, as untestable constructs risk pseudoscientific claims.
Institutional biases in peer review may undervalue dissenting causal models, yet truth-seeking demands scrutiny of such influences to preserve methodological integrity.

Forms and Classifications of Research

Original versus Derivative Research

Original research, also known as primary research, entails the direct collection and analysis of new data to address specific questions or test hypotheses, often through methods such as controlled experiments, surveys, or fieldwork. This form of inquiry generates firsthand evidence, enabling researchers to draw conclusions grounded in empirical observations rather than preexisting datasets. For instance, a clinical trial measuring the efficacy of a drug in human subjects qualifies as original research, as it produces previously unrecorded data on outcomes like recovery rates or side effects. In academia, original research appears in peer-reviewed journals as primary literature, where authors detail their methods, results, and interpretations to contribute new knowledge to the field. Derivative research, synonymous with secondary research, involves the synthesis, interpretation, or reanalysis of data and findings already produced by others, without generating new primary data. Common examples include literature reviews that compile and critique existing studies, meta-analyses that statistically aggregate results from multiple original investigations, or theoretical works that reinterpret historical data. This approach relies on the quality and completeness of prior sources, which can introduce cumulative errors or overlooked biases if the foundational data is flawed or selectively reported. While derivative efforts consolidate knowledge and identify patterns across studies, such as systematic reviews assessing treatment effectiveness, they do not advance the empirical frontier independently. The distinction between original and derivative research underscores differing contributions to knowledge accumulation: original work establishes causal links through direct evidence, whereas derivative work evaluates, contextualizes, or applies those links.
In practice, much published scholarship blends elements of both, but funding and prestige often favor original endeavors due to their potential for groundbreaking discoveries, though derivative analyses remain essential for validation and policy formulation.
| Aspect | Original Research | Derivative Research |
| --- | --- | --- |
| Data source | Newly collected (e.g., experiments, surveys) | Existing data from prior studies |
| Primary goal | Generate new evidence and insights | Synthesize, analyze, or reinterpret existing data |
| Examples | Field observations, lab trials | Meta-analyses, literature reviews |
| Strengths | Direct causality testing, reduced bias from synthesis | Identifies trends, cost-effective |
| Limitations | Resource-intensive, higher risk of error in novel methods | Dependent on source quality, potential propagation of flaws |

Scientific Research

Scientific research is the systematic investigation of natural phenomena through observation, experimentation, and analysis to generate new knowledge. It involves the planned collection, interpretation, and evaluation of empirical data to contribute to scientific understanding. Unlike derivative or non-empirical forms, scientific research prioritizes testable hypotheses and falsifiable predictions, as emphasized by philosopher Karl Popper's criterion that demarcates science from pseudoscience by requiring theories to be capable of being proven wrong through evidence. Key characteristics of scientific research include empiricism, relying on observable and measurable evidence; objectivity, minimizing researcher bias through standardized methods; replicability, allowing independent verification of results; and systematicity, following structured procedures rather than ad hoc approaches. These traits ensure that findings are provisional and subject to revision based on new data, fostering cumulative progress in knowledge. The process adheres to the scientific method, typically comprising steps such as: making observations to identify a question; formulating a testable hypothesis; designing and conducting experiments to gather data; analyzing results statistically; and drawing conclusions while iterating if necessary. This iterative cycle, often visualized as hypothesis testing followed by refinement or rejection, underpins advancements in fields like physics, chemistry, and biology. Reproducibility is foundational, yet challenges persist, as evidenced by the replication crisis, in which many published results fail independent verification. For instance, a 2015 effort to replicate 100 psychology studies succeeded in only 36% of cases with statistically significant effects matching the originals. Surveys indicate nearly three-quarters of biomedical researchers acknowledge a reproducibility crisis, attributed partly to "publish or perish" incentives favoring novel over robust findings.
Such issues underscore the need for rigorous statistical practices and preregistration to mitigate biases in data interpretation and publication.

Non-Empirical Research Forms

Non-empirical research derives conclusions through reasoning, logical analysis, and theoretical frameworks without collecting or analyzing observational data. This contrasts with empirical research, which relies on measurable phenomena observed in the world to test hypotheses and generate knowledge. Non-empirical methods emphasize a priori knowledge, truths independent of experience, and are foundational in disciplines where logical consistency supersedes sensory evidence. In mathematics, non-empirical research predominates through the construction and proof of theorems from established axioms using formal logic, yielding results verifiable solely by deduction rather than experiment. For example, the proof of Fermat's Last Theorem by Andrew Wiles in 1994 demonstrated that no positive integers a, b, and c satisfy a^n + b^n = c^n for n > 2, achieved via modular elliptic curves and without empirical testing. Such proofs establish universal truths applicable across contexts, independent of physical reality. Philosophical inquiry represents another core form, involving conceptual analysis, argumentation, and thought experiments to explore metaphysical, ethical, and epistemological questions. Thinkers like René Descartes employed methodological doubt in the 17th century to arrive at foundational certainties, such as "cogito, ergo sum" ("I think, therefore I am"), through introspective reasoning rather than external observation. Contemporary non-empirical ethics research, for instance, uses argument-based methods to evaluate moral frameworks in technology, prioritizing logical coherence over data from human behavior. Theoretical research in foundational sciences, such as certain aspects of theoretical physics or cosmology, also falls under non-empirical forms, where models are refined deductively to uncover structural possibilities. While these methods provide robust, timeless insights, evident in mathematics' role underpinning physics, they face criticism for potential detachment from reality, as untested theories risk irrelevance without eventual empirical linkage, though pure domains like mathematics require no such validation.
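The theorem mentioned above illustrates the deductive character of such results; stated formally (a LaTeX rendering of the claim as given in the text, assuming the `amssymb` symbols):

```latex
% Fermat's Last Theorem, as described in the text above.
\[
  \nexists\, a, b, c, n \in \mathbb{Z}^{+} \;\text{with}\; n > 2
  \;\text{such that}\; a^{n} + b^{n} = c^{n}.
\]
```

The statement is universal over all positive integers, which is precisely why no finite amount of empirical checking could establish it and a deductive proof was required.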

Applied versus Basic Research

Basic research, also known as fundamental or pure research, seeks to expand the boundaries of human knowledge by investigating underlying principles and phenomena without a predetermined practical goal. It prioritizes theoretical understanding, often through hypothesis testing and exploratory experiments, such as probing the properties of subatomic particles or genetic mechanisms. In contrast, applied research directs efforts toward solving specific, real-world problems by building on existing knowledge to develop technologies, products, or processes, exemplified by improvements in battery efficiency based on electrochemical principles. The modern distinction between these categories gained prominence in the mid-20th century, particularly through Vannevar Bush's 1945 report Science, the Endless Frontier, which positioned basic research as the "pacemaker of technological progress" essential for long-term innovation, while applied research translates discoveries into immediate utility. Bush advocated for federal investment in basic research via institutions like the proposed National Science Foundation, arguing it fosters serendipitous breakthroughs that applied efforts alone cannot achieve. This framework influenced U.S. science policy, embedding the dichotomy in funding mechanisms where basic research receives substantial public support (40% of U.S. basic research funding came from the federal government in 2022, compared to 37% from businesses), while applied research draws more from industry. Earlier conceptual roots trace to 18th-century separations of "pure" science from utilitarian pursuits, but Bush's linear model, with basic research preceding applied, formalized the distinction amid the post-World War II expansion of government-sponsored science. Methodologically, basic research emphasizes open-ended inquiry, replication, and peer-reviewed publication in journals, often yielding foundational theories like quantum mechanics, which underpin later applications in semiconductors. Applied research, however, integrates interdisciplinary teams, prototyping, and iterative testing oriented toward measurable outcomes, such as clinical trials for drug candidates following basic pharmacological studies.
Empirical analyses of citation networks reveal that basic research generates broader, longer-term impacts, with high-citation basic papers influencing diverse fields over decades, whereas applied outputs cluster in narrower, short-term applications. Yet the boundary is porous: feedback loops exist, as applied challenges refine basic theories, challenging the strict sequentiality of Bush's model. Critics contend the distinction is subjective and policy-driven, potentially distorting funding allocation by undervaluing hybrid efforts in which immediate applicability motivates fundamental inquiry. For instance, data show that grants labeled "basic" often yield patentable insights, blurring the lines and suggesting the categories serve administrative purposes more than the causal realities of discovery. Nonetheless, econometric studies affirm complementarity: investments in basic research enhance applied productivity by 20-30% in sectors like pharmaceuticals, as foundational knowledge reduces uncertainty in downstream development. This interdependence underscores that while applied research delivers tangible societal benefits, such as vaccines derived from basic virology, sustained progress requires prioritizing basic research to avoid depleting the knowledge reservoir upon which applications depend.

The Process of Conducting Research

Key Steps in Research

The research process entails a systematic approach to inquiry, often iterative rather than strictly linear, to generate reliable knowledge from empirical evidence or logical deduction. Core steps, as delineated in scientific methodology, begin with identifying a clear research question grounded in observable phenomena or gaps in existing knowledge. This initial formulation ensures focus and testability, preventing vague pursuits that yield inconclusive results. Subsequent steps involve conducting a thorough literature review to contextualize the question against prior findings, avoiding duplication and refining hypotheses based on established data. A hypothesis, or testable prediction, is then formulated, specifying expected causal relationships or outcomes. For empirical studies, this leads to designing a methodology that controls variables, selects appropriate samples, and outlines procedures to minimize bias. Data collection follows, employing tools such as experiments, surveys, or observations calibrated for precision and replicability; for instance, in controlled experiments, randomization and blinding techniques are applied to isolate causal effects. Analysis then applies statistical or qualitative methods to interpret the data, assessing significance through metrics like p-values or effect sizes while accounting for potential confounders. Conclusions are drawn only if supported by the evidence, with limitations explicitly stated to facilitate future scrutiny. Finally, results are disseminated via peer-reviewed publications or reports, enabling replication and building cumulative knowledge; this step underscores the self-correcting nature of research, where discrepancies prompt reevaluation of prior steps. Deviations from these steps, such as inadequate controls, have historically contributed to erroneous claims later retracted.
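The analysis step above, assessing significance via p-values and effect sizes, can be sketched in code. This is a minimal, standard-library-only illustration on synthetic data (the 6-point treatment effect and 15-point spread are hypothetical), not a substitute for a full statistical package:

```python
# Sketch of the analysis step: a two-group comparison yielding a test
# statistic, an approximate p-value, and an effect size (Cohen's d).
# All data are synthetic; in a real study they would come from the
# preregistered data-collection step.
import math
import random
import statistics

random.seed(42)

# Hypothetical control and treatment measurements (assumed roughly normal).
control = [random.gauss(100.0, 15.0) for _ in range(200)]
treatment = [random.gauss(106.0, 15.0) for _ in range(200)]

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = statistics.variance(a), statistics.variance(b)
    se = math.sqrt(va / len(a) + vb / len(b))
    return (statistics.fmean(b) - statistics.fmean(a)) / se

def cohens_d(a, b):
    """Standardized mean difference using the pooled standard deviation."""
    na, nb = len(a), len(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (statistics.fmean(b) - statistics.fmean(a)) / pooled_sd

t = welch_t(control, treatment)
d = cohens_d(control, treatment)
# With n = 200 per group the t distribution is close to normal, so a
# two-sided p-value can be approximated with the normal CDF via erf.
p = 2 * (1 - 0.5 * (1 + math.erf(abs(t) / math.sqrt(2))))
print(f"t = {t:.2f}, approx p = {p:.4f}, Cohen's d = {d:.2f}")
```

Reporting the effect size alongside the p-value, as the text recommends, guards against mistaking a tiny but "significant" difference for a practically meaningful one.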

Research Methodologies

Research methodologies comprise the planned strategies for data collection, analysis, and interpretation used to address research questions systematically. They are broadly classified into quantitative, qualitative, and mixed methods, each suited to different investigative needs based on the nature of the data and the objectives. Quantitative methodologies emphasize numerical data and statistical analysis to measure variables, test hypotheses, and establish patterns or causal links with a focus on objectivity and generalizability. Common techniques include experiments, surveys with closed-ended questions, and large-scale sampling, where researchers manipulate independent variables, such as in randomized controlled trials assigning participants randomly to treatment or control groups, to isolate effects while controlling confounders. These approaches yield replicable results from sizable datasets, enabling precise predictions and broad inferences, though they risk oversimplifying complex human behaviors by prioritizing measurable outcomes over contextual depth. Qualitative methodologies prioritize descriptive, non-numerical data to explore meanings, processes, and subjective experiences, employing methods like in-depth interviews, ethnographic observations, and thematic analysis. Case studies exemplify this by conducting intensive, multifaceted examinations of a single bounded case, such as an organization or community, to uncover intricate dynamics in real-world settings. While offering rich, nuanced insights into "how" and "why" phenomena occur, qualitative methods are susceptible to interpretive bias, smaller sample limitations, and challenges in achieving statistical generalizability. Mixed methods research integrates quantitative and qualitative elements within a single study to capitalize on their respective strengths, such as quantifying trends via surveys and elucidating mechanisms through follow-up interviews, thereby providing more holistic validation of findings.
This convergence approach, as outlined in frameworks like triangulation, mitigates individual method weaknesses but demands rigorous integration to avoid methodological conflicts. Other specialized methodologies include correlational designs, which assess variable associations without manipulation to identify potential relationships for further testing, and longitudinal studies tracking changes over time to infer developmental or causal trajectories. Method selection hinges on research goals, with quantitative approaches favoring empirical precision for hypothesis-driven inquiries and qualitative approaches enabling exploratory depth, while mixed methods suit multifaceted problems requiring both breadth and nuance. Empirical rigor in application, including random sampling and validity checks, is essential to counter inherent limitations such as bias or confounding variables across all types.
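The random-assignment step in the randomized controlled trials described above can be sketched as follows; participant IDs and group sizes are hypothetical, and a real trial would use a prespecified randomization scheme:

```python
# Sketch of random assignment for a two-arm trial: shuffle the participant
# list and split it in half, so confounders are balanced in expectation.
import random

def randomize(participants, seed=0):
    """Randomly split participants into two equal-sized arms."""
    rng = random.Random(seed)          # fixed seed for a reproducible allocation
    shuffled = participants[:]          # copy so the input list is untouched
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

participants = [f"P{i:03d}" for i in range(1, 101)]  # hypothetical IDs P001..P100
treatment, control = randomize(participants)
print(len(treatment), len(control))
```

Because assignment depends only on the random shuffle, any pre-existing participant characteristic is equally likely to land in either arm, which is what licenses the causal interpretation of a difference between groups.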

Tools and Technologies

Laboratory instruments form the backbone of empirical research in fields such as biology, chemistry, and materials science, enabling precise measurement and observation of physical phenomena. Common tools include microscopes for visualizing cellular structures, centrifuges for separating substances by density, and spectrophotometers for analyzing light absorption to determine concentrations. Additional essential equipment encompasses pH meters for acidity measurements, autoclaves for sterilization, and chromatography systems for separating mixtures based on molecular properties. These instruments rely on principles of physics and chemistry to generate reproducible data, though their accuracy depends on calibration and operator skill. Computational tools have revolutionized data analysis across disciplines, allowing researchers to process large datasets efficiently. Programming languages like Python, with libraries such as NumPy for numerical computations and pandas for data manipulation, are widely used for statistical modeling and machine learning applications. R serves as a primary tool for statistical analysis and visualization, particularly in bioinformatics and the social sciences, offering packages like ggplot2 for graphical representation. Software such as MATLAB supports simulations and algorithm development in engineering and physics, while tools like Tableau and Power BI facilitate interactive data visualization without extensive coding. Cloud-based platforms, including AWS and Google Cloud, enable scalable storage and computation for big-data challenges. Citation and reference management software streamlines scholarly writing by organizing sources and generating bibliographies. Zotero, an open-source tool, collects and annotates references from web pages and databases, integrating with word processors for seamless insertion. Electronic lab notebooks like LabArchives provide digital recording of experiments, enhancing reproducibility through timestamping and searchability.
Survey platforms such as Qualtrics support quantitative data collection via online questionnaires, with built-in analytics for preliminary processing. As of 2025, AI tools are increasingly integrated into research workflows for tasks like hypothesis generation, literature synthesis, and predictive modeling. Tools leveraging large language models assist in summarizing papers and identifying patterns in datasets, though their outputs require validation to mitigate errors from training-data biases. In scientific domains, AI platforms for molecular modeling accelerate drug discovery by simulating protein interactions, with empirical studies showing productivity gains in targeted applications. Despite the enthusiasm, rigorous evaluation reveals that AI enhances efficiency in data-heavy fields but does not supplant human judgment or experimental design.
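As a minimal illustration of the computational workflow these tools support, the following standard-library sketch parses a small CSV of measurements and computes summary statistics; the sample IDs and concentration values are hypothetical, and in practice NumPy or pandas would handle larger datasets:

```python
# Sketch of a basic tabular-data workflow: parse CSV records and summarize
# a numeric column. Uses only the standard library so it runs anywhere.
import csv
import io
import statistics

# Hypothetical instrument output, e.g. spectrophotometer readings.
raw = """sample_id,concentration
S1,0.42
S2,0.51
S3,0.47
S4,0.39
"""

rows = list(csv.DictReader(io.StringIO(raw)))
values = [float(r["concentration"]) for r in rows]

print(f"n = {len(values)}")
print(f"mean = {statistics.fmean(values):.3f}")
print(f"stdev = {statistics.stdev(values):.3f}")
```

The same three lines of aggregation would be a one-liner in pandas (`df["concentration"].describe()`), which is why such libraries dominate data-heavy research code.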

Ethics and Integrity in Research

Fundamental Ethical Principles

Fundamental ethical principles in research encompass standards designed to safeguard the integrity of scientific inquiry, protect participants and subjects, and ensure the reliability of knowledge production. These principles derive from historical precedents, including post-World War II responses to unethical human experiments and domestic scandals like the Tuskegee syphilis study, which prompted formalized guidelines. Core tenets emphasize honesty in data handling, accountability for outcomes, and fairness in resource allocation, countering incentives that might otherwise prioritize publication over truth. A foundational framework is provided by the Belmont Report of 1979, which identifies three basic principles for research involving human subjects: respect for persons, beneficence, and justice. Respect for persons requires treating individuals as autonomous agents capable of informed consent and providing extra protections for those with diminished autonomy, such as children or the cognitively impaired. Beneficence mandates maximizing benefits while minimizing harms, entailing systematic assessment of risks against potential gains and avoidance of unnecessary suffering. Justice demands equitable distribution of research burdens and benefits, preventing exploitation of vulnerable groups and ensuring fair selection of participants. Complementing these, the Singapore Statement on Research Integrity, issued in 2010 by the World Conference on Research Integrity, articulates four universal responsibilities: honesty, accountability, professional courtesy and fairness, and good stewardship. Honesty involves accurate reporting of methods, data, and findings without fabrication, falsification, or selective omission. Accountability requires researchers to adhere to ethical norms, report errors, and accept responsibility for misconduct allegations. Professional courtesy and fairness promote open sharing of data and ideas while respecting intellectual property and avoiding conflicts of interest. Good stewardship obliges efficient use of resources, mentoring of trainees, and dissemination of results to benefit society.
Additional principles include objectivity, which necessitates minimizing personal biases through rigorous design and peer scrutiny, and transparency, which facilitates replication by mandating detailed documentation of procedures and data. The U.S. Office of Research Integrity defines misconduct narrowly as fabrication, falsification, or plagiarism, underscoring that ethical conduct extends beyond non-violation to the proactive pursuit of rigor and fairness. Violations of these principles, often driven by publication pressures or funding dependencies, undermine public trust, as evidenced by retractions exceeding 10,000 annually in biomedical literature by the mid-2010s. Adherence requires institutional mechanisms like institutional review boards, which independently evaluate protocols against these standards prior to initiation.

Research Misconduct and Fraud

Research misconduct is defined as fabrication, falsification, or plagiarism in proposing, performing, or reviewing research, or in reporting research results, committed intentionally, knowingly, or recklessly. Fabrication involves making up data or results and recording or reporting them as if genuine, while falsification entails manipulating research materials, equipment, or processes, or changing or omitting data or results such that the research is not accurately represented in the research record. Plagiarism includes the appropriation of another person's ideas, processes, results, or words without giving appropriate credit. These acts deviate from accepted practices and undermine the integrity of the scientific enterprise, though not all errors or questionable research practices qualify as misconduct. Prevalence estimates for misconduct vary due to reliance on self-reports, which likely understate occurrences, and analyses of retractions, which capture only detected cases. Self-reported rates of fabrication, falsification, or plagiarism range from 2.9% to 4.5% across studies, with one international survey finding that one in twelve scientists admitted to such acts in the past three years. Questionable research practices, such as selective reporting or failing to disclose conflicts of interest, are more common, with up to 51% of researchers engaging in at least one. Among retracted publications, misconduct accounts for the majority: a study of over 2,000 retractions found 67.4% attributable to fraud or suspected fraud (43.4%), duplicate publication (14.2%), or plagiarism (9.8%), far exceeding error-based retractions. These figures suggest systemic under-detection, exacerbated by pressures in competitive fields like biomedicine. Principal causes include the "publish or perish" culture, where career advancement hinges on publication volume and impact, incentivizing corner-cutting amid competition and tenure demands. Lack of oversight in large labs, inadequate training, and rewards for novel findings over replication further contribute, as do personal factors like ambition or desperation under funding shortages.
In academia, where replication is undervalued and positive results are prioritized, these incentives distort behavior, with fraud more likely in high-stakes environments despite institutional norms against it. Notable cases illustrate the impacts: in the Hwang Woo-suk scandal, the South Korean researcher fabricated data in 2004-2005 publications, leading to retractions in Science and global scrutiny of stem cell claims. Similarly, John Darsee's 1980s fabrications at Harvard and NIH-funded labs involved inventing experiments across dozens of papers, resulting in over 100 retractions and a ten-year funding ban. Such incidents, often in biomedicine, highlight how undetected fraud can propagate for years before whistleblowers or statistical anomalies trigger investigations. Consequences encompass professional sanctions, including debarment from federal funding, institutional dismissal, and reputational harm, with eminent researchers facing steeper penalties than novices. Retractions erode citations for the affected work and linked studies, diminish journal impact factors, and foster distrust in science, as seen in rising retraction rates from under 100 annually before 2000 to thousands today. Broader effects include wasted resources (billions in follow-on research) and policy missteps, such as delayed vaccine uptake stemming from fraudulent autism-link claims. Prevention efforts focus on training in responsible conduct, institutional policies for data management and authorship, and oversight by bodies like the U.S. Office of Research Integrity (ORI), which investigates allegations and enforces agreements for corrections or retractions. Promoting transparency via open data repositories, preregistration of studies, and incentives for replication can mitigate pressures, though implementation varies, with training alone insufficient without cultural shifts away from rewarding publication quantity. Whistleblower protections and rigorous post-publication peer review are also emphasized to detect issues early.

Institutional Review and Oversight

Institutional Review Boards (IRBs), known internationally as Research Ethics Committees (RECs), serve as the primary mechanisms for ethical oversight of research involving human subjects, reviewing protocols to ensure participant rights and welfare and the minimization of risks. Established in the United States following the 1974 National Research Act, which responded to ethical failures like the Tuskegee syphilis study, IRBs must evaluate studies for compliance with federal regulations such as the Common Rule (45 CFR 46), assessing informed consent, risk-benefit ratios, and equitable subject selection. Committees typically include at least five members with diverse expertise, including non-scientists and community representatives, to provide balanced scrutiny; reviews can be full board for higher-risk studies, expedited for minimal-risk research, or exempt for certain low-risk activities like educational surveys. For research involving animals, Institutional Animal Care and Use Committees (IACUCs) provide analogous oversight, mandated by the Animal Welfare Act of 1966 and Public Health Service Policy, conducting semiannual program reviews, inspecting facilities, and approving protocols to ensure humane treatment, the 3Rs principle (replacement, reduction, refinement), and veterinary care. IACUCs, composed of scientists, non-affiliated members, and veterinarians, evaluate alternatives to animal use and monitor ongoing compliance, with the authority to suspend non-compliant activities. Globally, similar bodies exist, such as animal-welfare bodies under the European Union's Directive 2010/63/EU, though implementation varies by jurisdiction. Broader institutional oversight addresses research integrity and misconduct through bodies like the U.S. Office of Research Integrity (ORI) within the Department of Health and Human Services, which investigates allegations of fabrication, falsification, or plagiarism in Public Health Service-funded research, imposes sanctions, and promotes education on responsible conduct.
Institutions maintain their own research integrity offices to handle inquiries, often following federal guidelines that require prompt reporting and investigation, with ORI overseeing findings since its establishment in 1993 to centralize responses to misconduct cases. Critics argue that IRB processes impose excessive bureaucracy, causing delays—sometimes months for low-risk studies—and inconsistent decisions across institutions, potentially stifling legitimate research without commensurate improvements in participant protection. Overreach occurs when IRBs review non-research activities like oral history or quality improvement, expanding beyond regulatory intent, as evidenced by complaints from the humanities and social sciences, where federal exemptions are often ignored. Empirical analyses indicate limited evidence that IRBs reduce harms effectively, with costs in time and resources diverting from core scientific aims, prompting calls for streamlined reviews or exemptions for minimal-risk work. In dual-use research with potential misuse risks, committees' roles remain underdeveloped, highlighting gaps in proactive oversight.

Major Challenges and Systemic Issues

The Replication Crisis

The replication crisis denotes the systematic failure of numerous published scientific findings to reproduce in independent attempts, casting doubt on the reliability of empirical claims across multiple disciplines. This phenomenon emerged prominently in the early 2010s, particularly in psychology, where a large-scale effort by the Open Science Collaboration in 2015 attempted to replicate 100 studies published in top psychology journals in 2008; only 36% yielded statistically significant results in the direction of the originals, with effect sizes approximately half as large as those initially reported. Ninety-seven percent of the original studies had reported significant effects (p < 0.05), highlighting a stark discrepancy. Similar issues have surfaced in other fields, though rates vary; for instance, a 2021 analysis found 61% replication success for 18 economics experiments and lower rates in cognitive psychology. Replication failures extend beyond psychology to areas like biology and medicine, where preclinical cancer research has shown particularly low reproducibility; one pharmaceutical company's internal checks in 2011-2012 replicated only 11% of 53 high-profile studies. In economics, community forecasts anticipate around 58% replication rates, higher than in psychology or education but still indicative of systemic unreliability. Fields with stronger experimental controls, such as physics, exhibit fewer such problems due to larger-scale validations and less reliance on small-sample statistical inference, though even there, isolated high-profile disputes occur. Overall, the crisis underscores that much of the published literature may overestimate effect sizes due to selective reporting, eroding the foundational assumption of cumulative scientific progress. Primary causes include publication bias, where journals preferentially accept novel, positive results while null or contradictory findings languish unpublished, inflating the apparent rate of "discoveries." 
Questionable research practices exacerbate this: p-hacking involves flexibly analyzing data (e.g., excluding outliers or testing multiple outcomes) until a statistically significant result (p < 0.05) emerges by chance, while HARKing entails retrofitting hypotheses to fit observed data post-analysis. Low statistical power from underpowered studies—often using small samples to detect implausibly large effects—further compounds the issue, as true effects require replication with adequate power to distinguish signal from noise. These practices stem from academic incentives prioritizing quantity and novelty for tenure and funding over rigorous verification, with replication studies rarely published or funded. The crisis has profound implications, including eroded public trust in science, misallocation of resources toward building on false premises, and slowed progress in applied domains like medicine, where non-replicable preclinical findings delay effective therapies. It also reveals flaws in peer review, which often fails to detect inflated claims, and highlights how institutional pressures in academia—dominated by metrics like citation counts—favor sensationalism over truth-seeking. In response, reforms emphasize transparency and rigor: pre-registration of hypotheses and analysis plans on platforms like OSF.io commits researchers before data collection, mitigating p-hacking and HARKing. Open science initiatives promote sharing raw data, code, and materials, enabling independent verification, while calls for larger samples and Bayesian methods over rigid p-value thresholds aim to enhance power and inference. Post-crisis, psychological studies show trends toward stronger effects, bigger samples, and fewer "barely significant" results, suggesting gradual improvement. Dedicated replication journals and funding for verification efforts, alongside cultural shifts away from "publish or perish," represent ongoing efforts to realign incentives with reproducibility.
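The false-positive inflation from flexible multiple-outcome testing can be illustrated with a short simulation. This is only a sketch: the group size, the 5% threshold, and the count of ten outcomes are arbitrary illustrative choices.

```python
import random

random.seed(0)

def significant(n=30):
    """One simulated two-group 'study' with NO true effect: returns True
    if a z-test on the difference in means crosses p < 0.05 (two-sided)."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    diff = sum(a) / n - sum(b) / n
    se = (2 / n) ** 0.5          # standard error of the difference (sigma = 1, known)
    return abs(diff / se) > 1.96

trials = 2000
# Honest analysis: one pre-specified outcome per study.
single = sum(significant() for _ in range(trials)) / trials
# "p-hacked" analysis: declare success if ANY of 10 independent outcomes is significant.
hacked = sum(any(significant() for _ in range(10)) for _ in range(trials)) / trials

print(f"false-positive rate, 1 outcome:   {single:.2f}")   # ~0.05
print(f"false-positive rate, 10 outcomes: {hacked:.2f}")   # ~0.40 (= 1 - 0.95**10)
```

With no real effect anywhere, merely having ten chances to reach significance raises the apparent discovery rate from about 5% to about 40%, which is why pre-specifying a single primary outcome matters.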

Biases in Research

Biases in research encompass systematic deviations from true effects, arising from cognitive, methodological, or institutional factors that skew study design, execution, or reporting. These errors undermine the reliability of scientific claims, with empirical evidence showing their prevalence across disciplines, particularly in fields reliant on subjective interpretation like psychology and social sciences. For instance, confirmation bias leads researchers to selectively seek or interpret data aligning with preconceptions, often embedded in experimental design through choice of hypotheses or data analysis paths that favor expected outcomes. Observer bias further compounds this by influencing data collection based on researchers' expectations, as seen in studies where subjective assessments yield results correlated with the observer's prior beliefs rather than objective measures. Methodological biases, such as selection and sampling bias, distort participant or data inclusion, producing non-representative results; for example, convenience sampling in clinical trials can overestimate treatment effects if healthier subjects are disproportionately included. Publication bias exacerbates these issues by favoring studies with statistically significant or positive findings, with meta-analyses in psychology revealing that up to 73% of results lack strong evidence due to selective reporting, artificially inflating effect sizes in the literature. In medicine, this manifests in overestimation of drug efficacy, as negative trials remain unpublished, distorting clinical guidelines. Funding or sponsorship bias occurs when financial supporters influence outcomes to align with their interests, evident in industry-sponsored research where positive results for the sponsor's product appear 3-4 times more frequently than in independent studies. 
Examples include pharmaceutical trials selectively reporting favorable data or nutritional studies funded by food industries downplaying risks of high-fructose corn syrup. Ideological biases, particularly pronounced in academia, stem from the overrepresentation of left-leaning scholars—at some elite institutions as few as 1% of faculty identify as conservative—leading to skewed research agendas that underexplore or dismiss hypotheses conflicting with progressive priors, as in social psychology, where conservative viewpoints face hiring and publication barriers. This systemic imbalance, with faculty political donations to Democrats outnumbering Republicans by ratios exceeding 10:1 in the humanities and social sciences, fosters causal interpretations favoring environmental over genetic factors in behavior, or policy conclusions that prioritize equity narratives over empirical trade-offs. Mitigating biases requires preregistration of protocols, blinded analyses, and diverse research teams, though institutional incentives like tenure tied to publication volume perpetuate them; empirical audits, such as those revealing 50-90% exaggeration in effect sizes due to combined biases, underscore the need for skepticism toward uncorroborated claims from ideologically homogeneous fields. Mainstream academic sources often understate ideological distortions, attributing discrepancies to "facts" rather than selection effects, yet surveys confirm self-censorship among dissenting researchers due to peer hostility.
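The effect-size inflation produced by selective publication can also be demonstrated numerically. In this sketch (all parameters are illustrative assumptions), every simulated study estimates the same small true effect, but only estimates that come out positive and statistically significant are "published":

```python
import random

random.seed(1)

TRUE_EFFECT = 0.2            # assumed true standardized effect
N = 30                       # per-group sample size (illustrative)
SE = (2 / N) ** 0.5          # standard error of the estimated difference (sigma = 1)

all_estimates, published = [], []
for _ in range(5000):
    est = random.gauss(TRUE_EFFECT, SE)   # one study's noisy effect estimate
    all_estimates.append(est)
    if est / SE > 1.96:                   # only positive, significant results reach print
        published.append(est)

print(f"true effect:            {TRUE_EFFECT:.2f}")
print(f"mean over all studies:  {sum(all_estimates) / len(all_estimates):.2f}")  # ~0.20
print(f"mean of published only: {sum(published) / len(published):.2f}")          # ~0.63
```

The average published estimate comes out roughly triple the true effect, matching the pattern in which literatures built on selective reporting overstate effect sizes until replications correct them.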

Publication and Peer Review Flaws

Peer review serves as the primary mechanism for validating scientific manuscripts prior to publication, yet empirical evidence reveals systemic deficiencies that undermine its reliability as a quality filter. Studies demonstrate that peer review frequently fails to detect methodological errors or fraud, with experiments introducing deliberate flaws into submissions showing that reviewers miss most issues, as evidenced by a 1998 study where only a fraction of injected errors were identified. The process is subjective and prone to inconsistencies, with little rigorous data confirming its efficacy in improving manuscript quality or advancing scientific truth. Publication bias exacerbates these flaws by systematically favoring results with statistical significance or positive findings, distorting the scientific record and hindering meta-analyses. Defined as the selective dissemination of studies based on outcome direction or magnitude, this bias leads to overrepresentation of confirmatory evidence, as non-significant results face higher rejection rates from journals. Quantitative assessments indicate that this skew can inflate effect sizes in systematic reviews by up to 30% in fields like psychology and medicine, perpetuating erroneous conclusions until replication efforts reveal discrepancies. Biases inherent in peer review further compromise objectivity, including institutional affiliation favoritism, where manuscripts from prestigious universities receive more lenient scrutiny, disadvantaging researchers from lesser-known institutions. Ideological predispositions also influence evaluations, as shown in experiments where reviewers rated identical research on contentious topics like migration policy more favorably when aligned with prevailing academic paradigms, often reflecting left-leaning institutional norms that prioritize certain interpretive frameworks over empirical rigor. 
Such biases, compounded by anonymity, enable ad hominem attacks or confirmation of entrenched views, as documented in analyses of review processes across disciplines. The rise in retractions underscores peer review's inability to prevent flawed or fraudulent work from entering the literature, with biomedical retractions quadrupling from 2000 to 2021 and exceeding 10,000 globally in 2023 alone. Misconduct, including data fabrication, accounts for the majority of these withdrawals, with rates increasing tenfold since 1975, often undetected during initial review due to inadequate scrutiny of raw data or statistical practices. This trend signals not only heightened vigilance via post-publication audits but also foundational weaknesses in pre-publication gatekeeping, where resource constraints and reviewer overload—exacerbated by unpaid labor—prioritize speed over thoroughness. Additional operational flaws include protracted delays averaging 6-12 months per review cycle and high costs borne by journals without commensurate benefits, fostering predatory publishing alternatives that bypass rigorous checks. These issues collectively erode trust in published research, prompting calls for reforms like open review or statistical auditing, though evidence of their superiority remains preliminary.

Funding and Incentive Distortions

Scientific research is heavily influenced by funding mechanisms that prioritize measurable outputs, such as publications and grants, over long-term reliability or exploratory work. The "publish or perish" paradigm, where career advancement depends on publication volume, incentivizes researchers to produce numerous papers rather than rigorous, replicable findings, contributing to increased retractions and lower overall research quality. Hyper-competition for limited grants exacerbates this, with scientists spending substantial time on proposal writing—up to 40% of their effort—diverting resources from actual experimentation. This structure favors incremental, citation-maximizing studies over novel or null-result research, leading to stagnation in groundbreaking discoveries. Grant allocation processes introduce directional biases, steering research toward funder-preferred topics like high-impact or applied fields, while destabilizing foundational work through short-term funding cycles. Industry sponsorship, a significant funding source, correlates with outcomes favoring sponsors' interests, such as selective reporting or design choices that inflate efficacy. Government funding, which dominates public science, amplifies these issues; surveys indicate 34% of federally funded U.S. scientists have admitted to misconduct, including data manipulation, to align results with grant expectations. Peer-reviewed grants often perpetuate conformity, as reviewers favor proposals mirroring established paradigms, suppressing disruptive ideas. These incentives directly fuel the replication crisis by devaluing verification studies, which offer few publications or grants compared to original "positive" findings. Researchers face no systemic rewards for replication, despite evidence that up to 50% of studies in fields like psychology fail to reproduce, eroding trust in scientific claims. 
Funder emphasis on novelty and societal impact further marginalizes replications, creating a feedback loop where unreliable results propagate. Reforms, such as funding dedicated replication teams or rewarding quality metrics over quantity, have been proposed but face resistance due to entrenched career incentives.

Professionalization and Institutions

Training and Career Paths

Training for research careers typically begins with an undergraduate degree in a relevant discipline, followed by enrollment in a doctoral program. The PhD, as the cornerstone of advanced research training, emphasizes original investigation, data analysis, and scholarly communication, often spanning 5 to 11 years in total duration, inclusive of coursework, comprehensive examinations, and dissertation research. In biomedical sciences, median time to degree ranges from 4.88 to 5.73 years across subfields. Completion rates vary by discipline, with approximately 57% of candidates finishing within 10 years and 20% within 7 years, influenced by funding availability and program structure. Postdoctoral fellowships commonly follow the PhD, providing 1 to 5 years of mentored research to build publication records, grant-writing skills, and the independence required for permanent roles. These positions, often temporary and funded by grants or institutions, function as an extended apprenticeship, though they increasingly serve as a holding pattern amid limited faculty openings. In the United States, postdoctoral training hones not only technical expertise but also management and collaboration abilities essential for leading labs or teams. Academic career progression traditionally involves securing a tenure-track assistant professorship after postdoc experience, followed by evaluation for tenure after 5 to 7 years based on research output, teaching, and service. However, success rates remain low: fewer than 17% of new PhDs in science, engineering, and health-related fields obtain tenure-track positions within 3 years of graduation. By 2017, only 23% of U.S. PhD holders in these areas occupied tenured or tenure-track academic roles, a decline from prior decades. In computer science, the proportion advancing to tenured professorships stands at about 11.73%. Engineering fields show similar constraints, with an average 12.4% likelihood of securing tenure-track jobs over recent years. 
Beyond academia, PhD recipients pursue diverse paths in industry, government, and non-profits, leveraging analytical and problem-solving skills. Common roles include research scientists in private R&D, data scientists, policy analysts, and consultants, where private-sector employment now rivals academic hires in scale. Medical science liaisons and environmental analysts represent specialized applications, often offering higher initial salaries than academic starts but less autonomy in pure research. Systemic challenges arise from an oversupply of PhDs relative to academic positions, exacerbating competition and prolonging insecure postdoc phases that function as low-paid labor for grant-funded projects. Universities sustain PhD production to meet teaching and research demands via graduate assistants, yet this model yields far more doctorates than faculty slots, with only 10-30% securing permanent academic roles depending on field. This imbalance fosters career uncertainty, prompting calls for better preparation in non-academic skills and transparency about job prospects during training.

Academic and Research Institutions

Academic and research institutions, encompassing universities and specialized research centers, represent the institutional backbone of organized scientific inquiry, evolving from medieval teaching-focused universities to modern entities that integrate education, discovery, and application. The modern research university model originated in early 19th-century Prussia with Wilhelm von Humboldt's vision at the University of Berlin in 1810, emphasizing the unity of research and teaching to foster original knowledge production. This paradigm spread globally, particularly influencing the United States, where Johns Hopkins University, founded in 1876, became the first explicitly research-oriented American institution, prioritizing graduate training and specialized scholarship over undergraduate instruction alone. By the late 19th century, American public universities adopted similar structures, expanding graduate programs and research facilities, which propelled advancements in fields like physics and biology. In contemporary practice, these institutions conduct the majority of fundamental research, providing infrastructure such as laboratories, archives, and computational resources essential for empirical investigation and theoretical development. Universities train future researchers through doctoral programs, where students contribute to faculty-led projects, thereby perpetuating expertise while generating new data and publications. They also oversee ethical compliance via institutional review boards, which evaluate study designs for risks to human and animal subjects, though implementation varies and can introduce bureaucratic delays. Beyond universities, dedicated research institutes like Germany's Max Planck Institutes or the United States' national laboratories focus on targeted domains, often collaborating with academia to translate findings into practical outcomes. However, systemic challenges undermine their efficacy, including heavy reliance on competitive grant funding, which favors incremental, grant-attractive projects over high-risk, foundational work. 
The tenure-track system, designed to safeguard intellectual independence, frequently incentivizes prolific but superficial output to meet promotion criteria, with post-tenure productivity sometimes declining as measured by publication rates. Ideological homogeneity prevails, with approximately 60% of faculty in the humanities and social sciences identifying as liberal or far-left, correlating with reduced viewpoint diversity and potential suppression of heterodox inquiries, as evidenced by self-censorship surveys among academics. This imbalance, more pronounced in elite institutions, can distort research priorities toward prevailing narratives, as seen in uneven scrutiny of politically sensitive topics.

Publishing and Dissemination

Scientific publishing primarily occurs through peer-reviewed journals, where researchers submit manuscripts detailing their findings, methodologies, and analyses for evaluation by independent experts before acceptance. The process typically involves initial editorial screening, peer review for validity and novelty, revisions based on feedback, and final production including copy-editing and formatting. In 2022, global output of science and engineering articles reached approximately 3.3 million, with China producing 898,949 and the United States 457,335, reflecting the scale and international distribution of dissemination efforts. Preprints have emerged as a key mechanism for rapid dissemination, enabling authors to share unrefereed versions of their work on public servers such as arXiv for physics and mathematics or bioRxiv for biology, often months before formal publication. This approach accelerates knowledge sharing, allows community feedback to refine research, and has gained prominence, particularly during the COVID-19 pandemic, when preprints facilitated timely updates on evolving data. However, preprints lack formal validation, prompting journals to increasingly integrate them into workflows by reviewing posted versions or encouraging prior deposition. Open access (OA) models have transformed dissemination by removing paywalls, with gold OA—where articles are immediately freely available upon publication—rising from 14% of global outputs in 2014 to 40% in 2024. This shift, driven by funder mandates and institutional policies, contrasts with subscription-based access, though it introduces article processing charges that can burden authors and strain society publishers' revenues amid rising costs. Hybrid models and diamond OA (no-fee, community-supported) address some barriers, but predatory OA journals exploiting these trends underscore the need for rigorous vetting. 
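The rise in the gold-OA share quoted above implies a steady compound growth rate, which is easy to verify with the standard compound-growth formula (shares taken from the text):

```python
# Gold-OA share of global outputs: 14% in 2014 -> 40% in 2024 (figures from the text).
share_2014, share_2024, years = 0.14, 0.40, 10

# Compound annual growth rate of the share itself.
cagr = (share_2024 / share_2014) ** (1 / years) - 1
print(f"implied compound growth of the gold-OA share: {cagr:.1%} per year")  # ~11.1%
```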
Conferences complement journal publication by providing platforms for oral presentations, posters, and networking, enabling real-time dissemination and critique of preliminary or complete findings. Events organized by professional societies or field-specific bodies, such as those in health sciences or physics, foster collaboration and often lead to subsequent publications, though virtual formats have expanded access post-2020. Beyond these, supplementary methods like data repositories, policy briefs, and targeted media outreach extend reach, prioritizing empirical validation over broad publicity.

Economics and Global Context

Research Funding Sources

Research funding derives primarily from four categories: government agencies, private industry, higher education institutions, and philanthropic foundations or nonprofits. Globally, total gross domestic expenditure on research and development (GERD) approached $3 trillion in 2023, with the United States and China accounting for nearly half of this total through combined public and private investments. In high-income economies, business enterprises typically fund 60-70% of overall R&D, emphasizing applied and development-oriented work, while governments allocate a larger share—often over 40%—to basic research. Government funding constitutes the backbone of basic and public-good research, channeled through national agencies and supranational programs. In the United States, federal obligations for R&D totaled $201.9 billion in the proposed fiscal year 2025 budget, with key performers including the National Institutes of Health (NIH), which supports biomedical research; the National Science Foundation (NSF), focusing on foundational science; and the Department of Energy (DOE), advancing energy and physical sciences. These agencies funded 40% of U.S. basic research in 2022, prioritizing investigator-initiated grants amid competitive peer review processes. In the European Union, the Horizon Europe program disburses billions annually for collaborative projects across member states, with the European Commission awarding over 2,490 grants in recent cycles, often targeting strategic areas like climate and digital innovation. China, investing heavily in state-directed R&D, channels funds through ministries and programs like the National Natural Science Foundation of China, supporting rapid scaling in fields such as artificial intelligence and quantum technologies, with public expenditures exceeding those of the U.S. in higher education and government labs by 2023. 
Private industry provides the largest volume of funding in market-driven economies, directing resources toward commercially viable innovations. In the U.S., businesses financed 69.6% of GERD in recent years, performing $602 billion in R&D in 2021 alone, predominantly in sectors like pharmaceuticals, technology, and manufacturing where intellectual property yields direct returns. This sector contributed 37% of basic research funding in 2022, often through corporate labs or partnerships with academia, though priorities align with profit motives rather than pure knowledge advancement. Globally, industry R&D intensity—measured as expenditure relative to GDP—reaches 2-3% in OECD countries, with firms like those in semiconductors and biotech recouping investments via patents and market dominance. Higher education institutions and philanthropic entities supplement these sources with intramural funds and targeted grants. U.S. universities expended $59.6 billion in federal-supported R&D in fiscal year 2023, but also drew 5% from state/local governments and internal revenues, enabling flexibility in exploratory work. Private foundations account for about 6% of academic R&D, with examples including the Bill & Melinda Gates Foundation funding global health initiatives and the Howard Hughes Medical Institute supporting biomedical research training, typically awarding grants from $15,000 to over $500,000 per project. These sources, while smaller in scale, often fill gaps in high-risk or interdisciplinary areas overlooked by larger funders.

International Variations and Statistics

Global research and development (R&D) expenditures exhibit stark international disparities, with advanced economies dominating total spending while select nations prioritize intensity relative to GDP. In 2023, OECD countries collectively allocated approximately 2.7% of GDP to R&D, totaling around $1.9 trillion, though non-OECD performers like China contribute substantially to aggregate figures. The United States led in absolute R&D outlays at over $700 billion in 2022, followed closely by China, which surpassed $500 billion amid rapid state-driven expansion. In terms of R&D intensity, Israel invested 5.56% of GDP in 2022, South Korea 4.93%, and Belgium 3.47%, contrasting with lower shares in emerging markets like India (0.64%) and Brazil (1.15%). These variations reflect differing economic structures, policy emphases, and institutional capacities, where high-intensity nations often feature concentrated business-sector investments. Scientific publication output further highlights quantity-driven divergences, particularly Asia's ascent. In 2023, China produced over 1 million science and engineering articles, accounting for about 30% of global totals exceeding 3 million, while the United States produced around 500,000. India and Germany followed with over 100,000 each, underscoring a shift from Western dominance; China's volume has grown via incentives like publication quotas, though this correlates with proliferation in lower-tier journals. High-quality output, per metrics tracking contributions to elite journals, saw China edging the U.S. in share for 2023-2024, yet U.S. publications maintain superior average citation rates, with 20-30% higher impact in fields like biomedicine.
Country | R&D as % GDP (2022) | Total Publications (2023) | Avg. Citations per Paper (est. recent)
United States | 3.46 | ~500,000 | High (leads globally)
China | 2.40 | >1,000,000 | Moderate (quantity bias)
South Korea | 4.93 | ~80,000 | Above average
Germany | 3.13 | ~110,000 | High
Japan | 3.30 | ~70,000 | High
Human capital density amplifies these patterns, with researcher counts per million inhabitants ranging from over 8,000 in countries such as South Korea and Israel to under 1,000 in many developing nations. The European Union averaged around 5,000 researchers per million in recent years, bolstered by coordinated funding, while China's absolute researcher base exceeds 2 million but yields lower per-capita productivity due to uneven quality across institutions. Citation-based rankings reinforce quality gaps, with the U.S. topping global aggregates at over 16 million documents cited extensively, versus China's focus on volume, often critiqued for self-citation and integrity issues in state-influenced outputs. These metrics, drawn from databases like Scopus, reveal causal links between institutional freedom, funding stability, and sustained impact, beyond mere volume.

Private Sector and Market-Driven Research

The private sector accounts for the predominant share of global research and development (R&D) funding and performance, with businesses in the United States alone conducting $673 billion of the $892 billion total domestic R&D in 2022, or roughly 75%, surpassing federal government funding of $164 billion. This pattern reflects a broader trend where private investment has grown faster than public sources over recent decades, contributing to global R&D expenditures nearing $3 trillion in 2023—a near tripling since 2000—despite economic disruptions. In the European Union, business enterprise R&D represented about two-thirds of total R&D expenditure in 2023, underscoring the sector's role in driving applied innovation oriented toward marketable outcomes. Market-driven research operates under profit incentives that prioritize projects with demonstrable commercial viability, enabling swift adaptation to technological and consumer demands through competitive pressures. This contrasts with public-sector approaches often constrained by bureaucratic allocation and lower tolerance for failure, as private firms must justify investments via returns, fostering efficiency in resource use and iteration. Empirical outcomes include accelerated advancements in fields like drug development, where private entities dominated 73% of key milestones in analyzed portfolios, from preclinical testing to market approval. For example, Moderna's proprietary mRNA platform, developed through over a decade of private R&D starting in 2010, enabled the company's COVID-19 vaccine to enter phase 3 trials by July 2020 and receive emergency authorization months later, highlighting the sector's capacity for rapid scaling under market urgency. In technology sectors, private R&D has yielded foundational innovations such as the transistor, invented at Bell Labs in 1947, which underpinned the digital revolution and subsequent advancements. Similarly, competition among semiconductor firms has sustained exponential performance gains, with U.S. 
private R&D intensity reaching 2.57% of GDP in 2022, supporting iterative improvements that outpace theoretical predictions like Moore's law. Space exploration provides another case: SpaceX's reusable rocket technology, funded internally since 2002, reduced launch costs by orders of magnitude, achieving the first private crewed orbital mission in 2020 and enabling constellations like Starlink. These examples illustrate how market signals—via investor capital and revenue potential—direct resources toward high-impact, scalable solutions, often filling gaps left by slower public initiatives. Challenges persist, including potential underinvestment in pure basic research lacking immediate applications, as private agendas favor proprietary gains over open dissemination. Nonetheless, the sector's dominance in funding—evident in global corporate R&D growth of 6.1% in recent years—demonstrates its effectiveness in generating economic value, with innovations spilling over to broader societal benefits through knowledge diffusion and commercialization.

Recent Developments and Future Directions

Integration of Artificial Intelligence

Artificial intelligence (AI) has transformed research methodologies by automating data processing, enabling predictive modeling, and augmenting hypothesis generation across disciplines. In structural biology, DeepMind's AlphaFold2, released in 2021, achieved near-atomic accuracy in predicting protein structures, solving a 50-year challenge and generating models for approximately 200 million proteins through the AlphaFold Protein Structure Database developed with EMBL-EBI. This integration has expedited downstream applications, such as variant effect prediction and drug target identification, with studies reporting reduced reliance on costly experiments; for instance, models have informed over 1.9 million experimental structures deposited in public databases since 2021. AI tools have similarly streamlined literature synthesis and systematic review workflows. Elicit, an AI-powered platform, indexes over 125 million papers to perform semantic searches, extract structured data like study outcomes, and generate summaries, thereby compressing weeks of manual review into hours for researchers. Complementary systems, such as SciSpace and Research Rabbit, leverage citation networks and machine learning to map research landscapes, identify knowledge gaps, and automate reference curation, with adoption rising among academics for handling exponential publication growth—global scientific output exceeded 3 million papers annually by 2023. In fields like public health, generative AI aids protocol design and evidence synthesis, as evidenced by reviews of 2023–2025 literature showing its role in accelerating outbreak modeling and intervention evaluation. Generative AI further integrates into simulation-driven research, producing adaptive models that emulate complex phenomena beyond traditional numerical methods; for example, 2025 advancements in flow-matching techniques predict electron redistribution in chemical reactions with high fidelity, aiding materials science discovery. 
Benchmark improvements, per the 2025 AI Index, reflect AI's scaling in tasks like multimodal reasoning, supporting interdisciplinary applications from physics simulations to econometric forecasting. Despite these gains, AI integration introduces reproducibility and bias challenges that undermine causal validity. Opaque training processes and proprietary datasets often preclude exact replication, with studies highlighting failures in documenting hyperparameters or data provenance as primary barriers in machine learning experiments. Algorithmic bias, arising from skewed input corpora—frequently drawn from institutionally biased archives—propagates errors, as in medical imaging where underrepresented demographics yield disparate predictive accuracies, potentially exacerbating inequities without rigorous debiasing. Empirical validation against controlled experiments remains essential, as AI excels in pattern recognition but falters in establishing causality absent human-guided first-principles scrutiny.
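The documentation failures noted above—missing hyperparameters and untracked data provenance—can be mitigated with lightweight experiment manifests. The sketch below is illustrative, not a standard: the field names, hashing scheme, and `build_manifest` helper are assumptions chosen to show the idea of recording everything an independent group would need for an exact replication.

```python
import hashlib
import json


def data_fingerprint(data_bytes: bytes) -> str:
    """Return a SHA-256 digest identifying the exact training data used."""
    return hashlib.sha256(data_bytes).hexdigest()


def build_manifest(hyperparams: dict, data_bytes: bytes, seed: int) -> dict:
    """Bundle hyperparameters, a data checksum, and the RNG seed so a
    replication attempt can match the original run exactly."""
    return {
        "hyperparameters": hyperparams,
        "data_sha256": data_fingerprint(data_bytes),
        "random_seed": seed,
    }


# Hypothetical run: the values here stand in for a real experiment's config.
manifest = build_manifest(
    hyperparams={"learning_rate": 3e-4, "batch_size": 64, "epochs": 10},
    data_bytes=b"placeholder for the serialized training set",
    seed=42,
)
print(json.dumps(manifest, indent=2))
```

Archiving such a manifest alongside results addresses the two replication barriers the text identifies—hyperparameter documentation and data provenance—at essentially no cost.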

Advances in Open Science

Open science encompasses practices that enhance the accessibility, transparency, and reproducibility of research outputs, including open access publishing, preprint dissemination, data sharing, and adherence to principles like FAIR (Findable, Accessible, Interoperable, Reusable). Advances since 2020 have accelerated due to policy mandates, technological infrastructure, and crises like the COVID-19 pandemic, which underscored the value of rapid sharing. For instance, the European University Association's Open Science Agenda 2025 outlines priorities for reforming scholarly communication amid data-driven science, emphasizing institutional repositories and equitable access. Open access (OA) publishing has seen substantial growth, with revenues rising from $1.9 billion in 2023 to $2.1 billion in 2024, projected to reach $3.2 billion by 2028, driven by hybrid and fully OA journals. Springer Nature reported that 44% of its primary research articles were published OA in 2024, up from 38% in 2022, correlating with higher usage and citations. Preprint servers have similarly expanded, with platforms like bioRxiv and medRxiv transitioning to nonprofit structures in 2025 to broaden their scope beyond COVID-related topics, posting over 12,000 non-pandemic preprints in 2024 alone; this shift has enabled faster dissemination, and many universities now credit preprints in hiring and promotion decisions. Open data initiatives have advanced through repositories facilitating reuse, such as those aligned with the FAIR principles, first formalized in 2016 and now integrated into policies like those of the U.S. National Institutes of Health (NIH). These principles have improved data findability and interoperability, boosting innovation; for example, open data has enabled secondary analyses that create economic opportunities and address public problems. Despite challenges like AI-generated content infiltrating preprints, moderation efforts and evolving peer review models continue to enhance credibility and replicability in open science ecosystems.
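The FAIR principles become concrete when a dataset carries a machine-readable metadata record. The following is a minimal sketch under stated assumptions: the field names loosely echo common repository schemas but are illustrative, and the DOI and URL are hypothetical placeholders.

```python
import json

# Minimal FAIR-style dataset record: a persistent identifier makes the data
# Findable, a resolvable URL makes it Accessible, a standard serialization
# (JSON) makes it Interoperable, and an explicit license plus provenance
# notes make it Reusable.
record = {
    "identifier": "doi:10.0000/example",                     # hypothetical DOI
    "title": "Example survey dataset",
    "access_url": "https://repository.example.org/datasets/123",  # hypothetical
    "format": "text/csv",
    "license": "CC-BY-4.0",
    "provenance": {"collected": "2024", "method": "online survey"},
}

serialized = json.dumps(record, indent=2)
print(serialized)
```

Because the record is plain JSON, both humans and harvesting services can index it, which is the mechanism by which FAIR-aligned repositories improve findability and enable the secondary analyses described above.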

Responses to Ongoing Crises

In the wake of the COVID-19 pandemic, research has emphasized enhanced surveillance and rapid-response mechanisms for infectious diseases, including the expansion of genomic sequencing networks to track variants and the development of platform technologies like mRNA vaccines for faster deployment against future outbreaks. Global public funding for pandemic preparedness has surged, with dedicated initiatives allocating over $2 billion since 2020 to preclinical and clinical trials for broad-spectrum vaccines. However, empirical assessments indicate persistent gaps in equitable access and real-world efficacy testing, as initial emergency authorizations prioritized speed over long-term data collection. Climate change and energy security have driven reallocations in research priorities, with public investments in energy innovation across eight major economies rising 84% from $10.9 billion in 2001 to $20.1 billion in 2018, accelerating further post-2022 due to supply disruptions from the Russia-Ukraine conflict. Studies have quantified trade-offs between decarbonization and reliability, revealing that aggressive net-zero policies can exacerbate short-term energy shortages without corresponding advances in storage or baseload alternatives like nuclear power. European Union-funded projects, for instance, integrate modeling of polycrises—combining climate impacts, geopolitical conflict, and resource scarcity—to inform industrial policies, though causal analyses highlight how subsidies often favor intermittent renewables over dispatchable sources, delaying net security gains. Antimicrobial resistance (AMR), projected to cause 10 million annual deaths by 2050 if unchecked, has prompted targeted advances in diagnostics, antimicrobial stewardship programs, and novel therapies such as phage-based and CRISPR-edited treatments, with WHO-coordinated global surveillance systems expanding to over 100 countries by 2024.
Environmental research links AMR dissemination to agricultural runoff and wastewater, advocating integrated monitoring that ties resistance trends to biodiversity decline; here, microbiome engineering shows promise in restoring resilience against resistant pathogens. Despite these efforts, funding disparities persist; for example, U.S. climate-health research receives under $3 million yearly in federal extramural grants, limiting causal insights into vector-borne disease surges. Empirical data suggest that pollution controls could curb AMR spread by 30–50% in high-burden regions, yet implementation lags due to regulatory fragmentation.

References

  1. [1]
    What is research? | WMU Journal of Maritime Affairs
    Dec 16, 2021 · Research is defined as the creation of new knowledge and/or the use of existing knowledge in a new and creative way so as to generate new concepts.
  2. [2]
    Research
    Oct 23, 2022 · Research is a systematic, exhaustive, and intensive investigation and study of a topic, often employing hypothesis and experimentation, to discover new ...
  3. [3]
    What is Research Methodology? Definition, Types, and Examples
    Aug 28, 2023 · It includes all the important aspects of research, including research design, data collection methods, data analysis methods, and the overall ...What are the types of sampling... · Streamline Your Research...
  4. [4]
    Qualitative Research, History of - Sage Research Methods
    The beginnings of qualitative research, according to Vidich and Lyman, are located in the work of early ethnographers during the 17th century.
  5. [5]
    [PDF] Methodology Section for Research Papers - San Jose State University
    Your methodology should begin by describing your research question and the type of data you used in answering it. You want to indicate why this type of data is ...
  6. [6]
    How to Write Research Methodology for 2025: Overview, Tips, and ...
    A research paper's methodology section must shed light on how you were able to collect or generate your research data and demonstrate how you analyze them.
  7. [7]
    Definitions of Research and Development: An Annotated ...
    May 19, 2022 · This document provides definitions of research and development from US and international sources. The first section (I) presents statistical definitions of R&D.
  8. [8]
    Perspective: the role of science in society - PMC - NIH
    Apr 5, 2025 · Knowledge derived by science informs societal behaviors and policies. Furthermore, technologies discovered via science can be circular in that ...
  9. [9]
    Benefits of Conducting Research | First at LAS | University of Illinois ...
    Research enhances academic fields, helps society, increases student skills, and prepares students for careers, with employers valuing research experience.
  10. [10]
    The replication crisis has led to positive structural, procedural, and ...
    Jul 25, 2023 · The replication crisis has highlighted the need for a deeper understanding of the research landscape and culture, and a concerted effort ...
  11. [11]
    Measuring the societal impact of research - NIH
    Research is less and less assessed on scientific impact alone—we should aim to quantify the increasingly important contributions of science to society.
  12. [12]
    Chapter 1 History and Research Methods | Cognitive Foundations
    Qualitative designs, including participant observation, case studies, and narrative analysis are examples of such methodologies. For instance, detailed case ...
  13. [13]
    Research Methods: A History of Some Important Strands.
    Wilhelm Wundt's distinction between experimental methods appropriate for understanding simple processes and naturalistic methods appropriate for ...<|control11|><|separator|>
  14. [14]
    [PDF] A Study of the Importance of Academic Research in Social Sciences ...
    Sep 4, 2019 · Research in Social Sciences is as well important in any society as it helps in the cultural, aesthetic, spiritual, social and educational ...<|separator|>
  15. [15]
    Research - Etymology, Origin & Meaning
    Originating in the 1570s from French and Old French recercher, the word means to search closely or investigate thoroughly, derived from Latin circus meaning ...
  16. [16]
    Etymology of "research"? - meaning - English Stack Exchange
    Mar 11, 2012 · Since Middle French, from Old French recerchier, from re- + cerchier (“to look for”). Compare English research and recherche. Old French: ...
  17. [17]
    Definitions | ORI - The Office of Research Integrity
    The Common Rule defines research as “systematic investigation, including research development, testing and evaluation, designed to develop or contribute to ...
  18. [18]
    Definitions and Categories of Research
    Research means a systematic investigation, including research development, testing and evaluation, designed to develop or contribute to generalizable knowledge.
  19. [19]
    Definitions of Human Subjects and Research - Boise State University
    A “systematic investigation” is typically a predetermined method for studying a specific topic, answering a specific question(s), testing a specific hypothesis( ...
  20. [20]
    Does Your Study Require IRB Review?
    The Office for Human Research Protections (OHRP) defines research as a systematic investigation, including research development, testing and evaluation, ...
  21. [21]
    Definition of Academic Research for 2025
    Research is part and parcel of academic life, but what is academic research? It is what trains students to foster critical thinking and analytical skills.
  22. [22]
    What Is Academic Research? Definition, Purpose & Process
    Jul 30, 2025 · Academic research is a systematic and critical investigation rooted in structured inquiry and critical thinking skills.
  23. [23]
    What is the Difference Between Inquiry and Research - Pediaa.Com
    Aug 3, 2021 · Inquiry is the process of finding answers to questions, whereas research is the systematic and formal investigation and study of materials and sources.
  24. [24]
    Understanding the Difference Between Inquiry and Research
    Inquiry emphasizes exploration and discovery, while research focuses on establishing facts and conclusions through systematic study. Inquiry is broad and ...
  25. [25]
    Step 1. Is Your Project Considered Research? - UW Research
    Systematic investigation: A detailed or careful examination that has or involves a prospectively identified approach to the activity based on a system, method, ...
  26. [26]
    Systematic investigation Definition | Law Insider
    Systematic investigation means an activity that involves a retrospective or prospective research plan that incorporates data collection.
  27. [27]
    Philosophical Foundations of Research: Epistemology
    Nov 13, 2015 · Epistemology is the study of the nature of knowledge. It deals with questions as is there truth and or absolute truth, is there one way or many ...
  28. [28]
    A guide to ontology, epistemology, and philosophical perspectives ...
    May 2, 2017 · By looking at the relationship between a subject and an object we can explore the idea of epistemology and how it influences research design.
  29. [29]
    The role of epistemology and ontology in research design
    Sep 19, 2025 · Epistemology and ontology are more than abstract ideas; they're the philosophical foundations of research design. By making your assumptions ...
  30. [30]
    Key Philosophers of Science to Know for Philosophy of Science
    Aristotle. Developed the concept of empirical observation and systematic classification of knowledge. · Francis Bacon. Advocated for the scientific method based ...
  31. [31]
    The philosophy of science
    Arguably the founder of both science and philosophy of science. · Francis Bacon (1561-1626) · Rene Descartes (1596-1650) · Piere Duhem ( ...
  32. [32]
    Rationalism vs. Empiricism - Stanford Encyclopedia of Philosophy
    Aug 19, 2004 · Rationalism and empiricism differ on how much we rely on experience. Rationalism includes innate knowledge, while empiricism sees experience as ...
  33. [33]
    James Robert Brown (ed.), Philosophy of science: the key thinkers
    Abstract. From the 19th century the philosophy of science has been shaped by a group of influential figures. Who were they? Why do they matter?
  34. [34]
    Philosophical foundations of research, and the case of the epistemic ...
    Philosophical foundations are the core of each individual researcher and all research questions, hypothesis, methodologies, recommendations are shaped by it.
  35. [35]
    Primary vs Secondary Research – What's the Difference? - Qualtrics
    Primary vs secondary research: in a nutshell. The essential difference between primary and secondary research lies in who collects the data.
  36. [36]
    Primary vs. Secondary Research - Research Methods - LibGuides
    Aug 4, 2025 · Primary research involves collecting original data directly from sources through methods you design and implement. As the researcher, you have ...
  37. [37]
    Types of research article | Writing your paper - Author Services
    Original research articles are the most common type of journal article. They're detailed studies reporting new work and are classified as primary literature.Missing: derivative | Show results with:derivative
  38. [38]
    Primary Vs. Secondary Research: What's The Difference?
    Primary research creates new data, while secondary research analyzes existing research. Tertiary research goes one step further by synthesizing both primary and ...
  39. [39]
    QuickLesson 10: Original Records, Image Copies, and Derivatives
    Jul 28, 2012 · Evidence comes in two basic classes: direct and indirect. So now, our first quibble: Derivatives can sometimes be more reliable than originals.
  40. [40]
    Primary vs. Secondary Sources | Difference & Examples - Scribbr
    Jun 20, 2018 · Primary sources provide raw information and first-hand evidence. Secondary sources interpret, analyze or summarize primary sources.Primary Vs. Secondary... · Examples Of Sources That Can... · Primary Vs Secondary Sources...<|control11|><|separator|>
  41. [41]
    Need advice on unoriginal research : r/PhD - Reddit
    Nov 24, 2022 · Your expectations are far too high. The vast majority of research is 'derivative' by it's very nature. Make sure your supervisor helps with the ...Trying to come up with my own ideas for research; my friend says ...Why is it an insult to call a Scientist's work "derivative"? - RedditMore results from www.reddit.com
  42. [42]
    What is Scientific Research? - Nerac, Inc.
    May 14, 2024 · Scientific research is the systematic investigation of natural phenomena through observation, experimentation, and analysis to generate new knowledge.
  43. [43]
    What is Scientific Research and How Can it be Done? - PMC - NIH
    Research conducted for the purpose of contributing towards science by the systematic collection, interpretation and evaluation of data.
  44. [44]
    Karl Popper: Falsification Theory - Simply Psychology
    Jul 31, 2023 · Karl Popper's theory of falsification contends that scientific inquiry should aim not to verify hypotheses but to rigorously test and identify conditions under ...
  45. [45]
    Falsifiability - Karl Popper's Basic Scientific Principle - Explorable.com
    Falsifiability, according to the philosopher Karl Popper, defines the inherent testability of any scientific hypothesis.
  46. [46]
    Five Characteristics Of The Scientific Method - Sciencing
    Aug 30, 2022 · The five key characteristics of the scientific method are: empirical, replicable, provisional, objective, and systematic.<|separator|>
  47. [47]
    Chapter 1 Science and Scientific Research - Lumen Learning
    Scientific research involves continually moving back and forth between theory and observations. Both theory and observations are essential components of ...
  48. [48]
    Steps of the Scientific Method - Science Buddies
    The six steps are: ask a question, background research, construct a hypothesis, experiment, analyze data, and communicate results.
  49. [49]
    The scientific method (article) - Khan Academy
    1. Make an observation. · 2. Ask a question. · 3. Propose a hypothesis. · 4. Make predictions. · 5. Test the predictions. · 6. Iterate.
  50. [50]
    Science has been in a “replication crisis” for a decade. Have ... - Vox
    Oct 14, 2020 · The replication crisis has led a few researchers to ask: Is there a way to guess if a paper will replicate? A growing body of research has found ...
  51. [51]
    'Publish or perish' culture blamed for reproducibility crisis - Nature
    Jan 20, 2025 · Nearly three-quarters of biomedical researchers think there is a reproducibility crisis in science, according to a survey published in November.
  52. [52]
    What the replication crisis means for intervention science - PMC
    Central to concerns in the replication crisis is a demand for larger data sets for maximizing statistical power while simultaneously questioning the incentive ...
  53. [53]
    Arrendale Library: Empirical & Non-Empirical Research
    This guide provides an overview of empirical research and quantitative and qualitative social science research methods; non-empirical research is defined also..
  54. [54]
    (PDF) Empirical and Non-Empirical Methods - ResearchGate
    The dividing line between empirical and non-empirical methods is marked by scholars' approach to knowledge gain (ie, epistemology).
  55. [55]
    Types of Research - Research Methods
    Apr 5, 2023 · Non-empirical Studies do not require researchers to collect first-hand data. Video: Experimental Research Methods.Missing: forms | Show results with:forms
  56. [56]
    Is mathematical truth empirical? - Philosophy Stack Exchange
    Apr 20, 2024 · Mathematics is not empirical: A mathematical proof is independent from any experience. It only relies on the correct application of the usual or a refined ...Theories in science that make claims that are not empirical in natureAre there non-empirical kinds of aposteriori justifcation?More results from philosophy.stackexchange.com
  57. [57]
    Philosophy of Mathematics
    Sep 25, 2007 · Philosophy of mathematics is concerned with problems that are closely related to central problems of metaphysics and epistemology.Philosophy of Mathematics... · Four schools · Platonism · Special Topics
  58. [58]
    Non-empirical methods for ethics research on digital technologies in ...
    Aug 9, 2024 · This systematic journal review analyses the reporting of ethical frameworks and non-empirical methods in argument-based research articles on digital ...
  59. [59]
    Non-Empirical But Scientific | Richard Dawid - Inference Review
    Ellis's main concern is not about string theory but the multiverse scenarios of cosmic inflation and Everettian quantum mechanics.
  60. [60]
    Poincare's Philosophy of Mathematics
    In short, a priori intuition supplies the non-empirical content of mathematics. Mathematics has a distinctive subject matter, but that subject matter is not ...
  61. [61]
    What is Basic Research? Insights from Historical Semantics - PMC
    But what exactly is basic research? What is the difference between basic and applied research? This article seeks to answer these questions by applying ...
  62. [62]
    Basic vs. Applied Research: What's the Difference? | Indeed.com
    Jul 24, 2025 · Basic research focuses on the advancement of knowledge, rather than solving a problem. Applied research directs its efforts toward finding a solution to a ...Missing: funding | Show results with:funding
  63. [63]
    [PDF] The Endless Frontier - 75th Anniversary Edition
    “...basic research is the pacemaker of technological progress.” That statement is as relevant today as it was in 1945 when Vannevar Bush wrote.
  64. [64]
    RIP: The Basic/Applied Research Dichotomy
    Bush's separation of research into “basic” and “applied” domains has been enshrined in much of US science and technology policy over the past seven decades.
  65. [65]
    Analysis of Federal Funding for Research and Development in 2022
    Aug 15, 2024 · National Patterns estimates show that in 2022 40% and 37% of basic research is funded by the federal government and businesses, respectively.Missing: outcomes | Show results with:outcomes
  66. [66]
    Tracing Long-Term Outcomes of Basic Research Using Citation ...
    Sep 7, 2020 · For many years, U.S. federal science agencies such as the National Institutes of Health have demonstrated the impact of the research they ...<|control11|><|separator|>
  67. [67]
    Quantifying advances from basic research to applied research in ...
    We develop a methodology that indexes levels of advancement from basic research to applied research based on large-scale text data.Missing: NSF | Show results with:NSF
  68. [68]
    in Brief | Returns to Federal Investments in the Innovation System ...
    ... data and macro-models, demonstrates that basic research generates results that make applied research more productive. This in turn has significant positive ...
  69. [69]
    A Historical Perspective on the Distinction Between Basic and ... - jstor
    May 11, 2017 · Abstract The traditional distinction between basic ("pure") and applied science has been much criticized in recent decades.
  70. [70]
    Should We Abandon the Distinction Between Basic and Applied ...
    Jul 12, 2022 · by Stuart Buck. For many decades, science policy and funding have made a distinction between “basic” and “applied” research.
  71. [71]
    Lost in Translation—Basic Science in the Era of ... - PubMed Central
    In Bush's view, basic research would be performed by academia, and applied research would be performed largely by industry and government facilities (11). The ...
  72. [72]
    Applied research won't flourish without basic science - PMC
    Sep 10, 2024 · Without an equal and ongoing commitment to basic science, there would soon be nothing to translate into applications.Missing: data | Show results with:data
  73. [73]
    How to Conduct Scientific Research? - PMC - NIH
    Jun 1, 2017 · Scientific research involves systematic methods, starting with a question and hypothesis, testing it, data analysis, and reevaluation of ...Missing: key | Show results with:key
  74. [74]
    Steps in the Scientific Process - GLOBE.gov
    The steps are: 1) Observe nature; 2) Ask questions and develop hypothesis; 3) Plan and conduct an investigation; 4) Analyze and interpret data; 5) Construct ...
  75. [75]
    Basic Steps in the Research Process
    Step 1: Identify and develop your topic · Step 2 : Do a preliminary search for information · Step 3: Locate materials · Step 4: Evaluate your sources · Step 5: Make ...
  76. [76]
    The Scientific Method - University of Nevada, Reno Extension
    STEP 1. Make an OBSERVATION · STEP 2. Define the PROBLEM · STEP 3: Form the HYPOTHESIS · STEP 4: Conduct the EXPERIMENT · STEP 5: Derive a THEORY.Missing: key | Show results with:key<|separator|>
  77. [77]
    Steps in the Research Process - University System of Georgia
    A list of ten steps · STEP 1: Formulate your question · STEP 2: Get background information · STEP 3: Refine your search topic · STEP 4: Consider your resource ...
  78. [78]
    Research Methods | Definitions, Types, Examples - Scribbr
    Rating 5.0 (5,629) Research methods are ways of collecting and analyzing data. Common methods include surveys, experiments, interviews, and observations.
  79. [79]
    Quantitative, Qualitative, and Mixed Research - Sage Publishing
    Mixed research (or mixed methods research) involves the mixing of quantitative and qualitative research methods, approaches, or other paradigm characteristics.
  80. [80]
    Experimental Research | Educational Research Basics by Del Siegle
    Experimental research manipulates the independent variable and randomly assigns individuals to treatment categories, unlike quasi-experimental research.
  81. [81]
    5.2 Experimental Design – Research Methods in Psychology
    Almost every experiment can be conducted using either a between-subjects design or a within-subjects design.
  82. [82]
    Qualitative vs Quantitative Research: What's the Difference?
    May 16, 2025 · Qualitative research deals with words, meanings, and experiences, while quantitative research deals with numbers and statistics.What Is Qualitative Research? · Advantages · What Is Quantitative Research?
  83. [83]
    Pros And Cons Of Qualitative Research vs Quantitative Research
    Sep 9, 2020 · Larger sample sizes · Impartiality and accuracy of data · Faster and easier to run · Data is anonymous · Offers reliable and continuous information.The difference between... · So when can qualitative and...
  84. [84]
    What are qualitative, quantitative and mixed research methods?
    Qualitative research studies collect descriptive data (mostly words). They use methods such as interviews, focus groups, content analysis, and observation.
  85. [85]
    What Is a Case Study? | Definition, Examples & Methods - Scribbr
    Rating 4.0 (4,528) May 8, 2019 · Case studies tend to focus on qualitative data using methods such as interviews, observations, and analysis of primary and secondary sources ( ...When to do a case study · Step 1: Select a case · Step 3: Collect your data
  86. [86]
    The case study approach - PMC - PubMed Central - NIH
    The case study approach allows in-depth, multi-faceted explorations of complex issues in their real-life settings.
  87. [87]
    What Is Qualitative vs. Quantitative Study? - National University
    Apr 27, 2023 · Strengths and Limitations: Qualitative research allows deeper insights but risks bias and limited scope; quantitative research provides ...
  88. [88]
    Mixed Methods Research - Harvard Catalyst
    Mixed methods strategically integrates or combines rigorous quantitative and qualitative research methods to draw on the strengths of each.
  89. [89]
    Mixed-Methods Research: Combining Qualitative and Quantitative ...
    Jul 25, 2025 · Mixed-methods research combines qualitative and quantitative methods within a single research project to answer the same overarching research ...Example of a Mixed-Methods... · Why Use Mixed-Methods?
  90. [90]
    3.3 Correlational and Experimental Research
    The experimental method is the only research method that can measure cause and effect relationships between variables. Three conditions must be met in order ...Correlational Research · Understanding Correlation · Developmental Research
  91. [91]
    Strengths and limitations | Better Thesis
    Quantitative approaches are best used to answer what, when and who questions and are not well suited to how and why questions. Qualitative data are usually ...
  92. [92]
    Strengths and Limitations of Qualitative and Quantitative Research ...
    Oct 18, 2017 · Both methodologies offer a set of methods, potentialities and limitations that must be explored and known by researchers. This paper concisely ...
  93. [93]
  94. [94]
    Lab Instruments in 2025: Everything You Need to Know | - Scispot
    May 31, 2025 · Lab instruments include medical (e.g., hematology analyzers), analytical (e.g., chromatography), microbiology (e.g., autoclaves), and chemistry ...
  95. [95]
  96. [96]
    How Scientific Tools Power the Life Sciences Industry
    May 21, 2025 · Accuracy & Precision: Instruments such as spectrometers, microscopes, and measuring devices let scientists collect data with high accuracy and ...
  97. [97]
    8 Top Tools For Data Analysis In Research Example - Insight7
    To achieve insightful data interpretation, consider tools like Python with Pandas and NumPy, which facilitate robust data manipulation. Similarly, RStudio ...<|separator|>
  98. [98]
    Quantitative Analysis Tools - Data Analysis - Research - Guides
    Oct 1, 2025 · R is a free, open-source coding program that can be used for data cleaning, analysis (including statistics), and visualization.
  99. [99]
    Top 15 Data Analytics Tools You Should Be Using in 2025
    Jun 11, 2025 · Visual tools such as Power BI, Tableau, and Looker Studio help users present and explore data without writing code. Tools like Python, R, and ...
  100. [100]
    15 Data Analysis Tools and When to Use Them - Coursera
    May 9, 2025 · In this article, we will explore 15 data analysis tools and software, how they differ, and how you can showcase related skills to potential employers.
  101. [101]
    The 11 Best Technology Tools for Researchers - Atlas of Science
    Feb 28, 2020 · 1. Zotero. This free tool wants to be “your personal research assistant.” It's a free-to-use citation manager that helps you collect, organize, ...
  102. [102]
    6 Essential IT Tools for Researchers | Information Technology - Pitt IT
    May 15, 2024 · 1. Collect Participant Data (Qualtrics) · 2. Document and Manage Lab Activities (LabArchives) · 3. Store Large Data Sets (Enterprise Cloud Storage).
  103. [103]
    AI for research: the ultimate guide to choosing the right tool - Nature
    Apr 7, 2025 · Here, Nature explores how academics and students can harness AI to streamline various parts of the research process.<|separator|>
  104. [104]
    10+ Scientific AI Tools Every Scientist Should Know in 2025/26
    Jul 23, 2025 · This article explores more than 10 essential AI tools that are shaping the future of hypothesis generation, molecular modeling, data analysis, and beyond.
  105. [105]
    The 2025 AI Index Report | Stanford HAI
    Meanwhile, a growing body of research confirms that AI boosts productivity and, in most cases, helps narrow skill gaps across the workforce.Responsible AI · Status · Research and Development · The 2023 AI Index Report
  106. [106]
    The Belmont Report | HHS.gov
    Aug 26, 2024 · Ethical Principles and Guidelines for the Protection of Human Subjects of Research. The Belmont Report was written by the National ...
  107. [107]
    Singapore Statement on Research Integrity
    The Singapore Statement on Research Integrity is intended to challenge governments, organizations and researchers to develop more comprehensive standards.
  108. [108]
    Read the Belmont Report | HHS.gov
    Jul 15, 2025 · Part B: Basic Ethical Principles · 1. Respect for Persons. · 2. Beneficence. · 3. Justice.Ethical Principles and... · Basic Ethical Principles · Applications
  109. [109]
    What Is Ethics in Research & Why Is It Important?
    ... essential to collaborative work, such as trust, accountability, mutual respect, and fairness. For example, many ethical norms in research, such as guidelines ...
  110. [110]
    Five principles for research ethics
    Jan 1, 2003 · 1. Discuss intellectual property frankly · 2. Be conscious of multiple roles · 3. Follow informed-consent rules · 4. Respect confidentiality and ...
  111. [111]
    Guiding Principles for Ethical Research - NIH
    Jun 10, 2025 · Social and clinical value · Scientific validity · Fair subject selection · Favorable risk-benefit ratio · Independent review · Informed consent.
  112. [112]
    Definition of Research Misconduct | ORI
    Research misconduct means fabrication, falsification, or plagiarism in proposing, performing, or reviewing research, or in reporting research results.
  113. [113]
    What Is Research Misconduct - NIH Grants & Funding
    Aug 19, 2024 · Research misconduct means fabricating, falsifying, and/or plagiarizing in proposing, performing, or reviewing research, or in reporting research results.
  114. [114]
    RCR Casebook: Research Misconduct | ORI
    only three fall within the federal definition of “research misconduct”: Falsification, fabrication, and plagiarism—commonly known as “FFP ...
  115. [115]
    Misconduct in Biomedical Research: A Meta-Analysis and ...
    The prevalence of research misconduct for plagiarism was 4.2% for self-reported and 27.9% for nonself-reported studies. Data fabrication was 4.5% in self- ...
  116. [116]
    Landmark research integrity survey finds questionable practices are ...
    Jul 7, 2021 · And one in 12 admitted to committing a more serious form of research misconduct within the past 3 years: the fabrication or falsification of ...
  117. [117]
    A survey among academic researchers in The Netherlands
    Feb 16, 2022 · Prevalence of QRPs ranged from 0.6% (95% CI: 0.5, 0.9) to 17.5% (95% CI: 16.4, 18.7) with 51.3% (95% CI: 50.1, 52.5) of respondents engaging ...
  118. [118]
    Misconduct accounts for the majority of retracted scientific publications
    67.4% of retractions were attributable to misconduct, including fraud or suspected fraud (43.4%), duplicate publication (14.2%), and plagiarism (9.8%).
  119. [119]
    Reasons and Types of Research Misconduct - Researcher.Life
    Apr 3, 2023 · Research misconduct arises from various factors such as pressure to publish, competition for grants, and career advancement. Lack of supervision ...
  120. [120]
    Deceiving scientific research, misconduct events are possibly ... - NIH
    Aug 23, 2022 · ... Scientific fraud, Research misconduct and respect. Background. Scientists are under great pressure to publish not only high-quality research ...
  121. [121]
    Why Is There So Much Fraud in Academia? - Freakonomics
    Jan 10, 2024 · Episode 572. Why Is There So Much Fraud in Academia? Some of the biggest names in behavioral science stand accused of faking their results.
  122. [122]
    Detailed Case Histories - Fostering Integrity in Research - NCBI - NIH
    The following five detailed case histories of specific cases of actual and alleged research misconduct are included in an appendix to raise key issues and ...
  123. [123]
    Data integrity scandals in biomedical research: Here's a timeline
    May 17, 2023 · An NIH review uncovered “wide-ranging scientific misconduct,” concluding that Darsee fabricated data from fictional experiments. Consequently, ...
  124. [124]
    The consequences of retraction: Do scientists forgive and forget?
    Jun 16, 2015 · We find that eminent scientists are more harshly penalized than their less-distinguished peers in the wake of a retraction, but only in cases ...
  125. [125]
    Characterizing the effect of retractions on publishing careers - Nature
    Apr 11, 2025 · Retracting academic papers is a fundamental tool of quality control, but it may have far-reaching consequences for retracted authors and their careers.
  126. [126]
    Fraud in research and the tangible cost of retractions - Morressier
    Sep 14, 2022 · Retractions severely damage the prestige of scholarly publications, leading to fewer citations, lower impact factors, and fewer submissions.
  127. [127]
    Interventions to prevent misconduct and promote integrity in ...
    We identified a range of interventions aimed at reducing research misconduct. Most interventions involved some kind of training, but methods and content varied ...
  128. [128]
    Case Summaries | ORI - The Office of Research Integrity
    This page contains cases in which administrative actions were imposed due to findings of research misconduct.
  129. [129]
    A simple — but not easy — way to stop research misconduct | STAT
    Sep 23, 2024 · Experts have proposed various solutions to the problem of research misconduct, ranging from requiring training and mentorship in research ethics ...
  130. [130]
    Institutional capacity to prevent and manage research misconduct
    Jul 12, 2023 · Prevention of misconduct involves awareness creation about misconduct and its consequences, training in the broad concepts of responsible ...
  131. [131]
    The History and Role of Institutional Review Boards: A Useful Tension
    Apr 1, 2009 · Institutional review boards (IRBs) play a role in approving research that involves human subjects.
  132. [132]
    Beginner's Guide to institutional review boards (IRBs) - Advarra
    Feb 24, 2022 · An IRB is an independent group that reviews and monitors human subject research to protect participant rights and welfare.
  133. [133]
    History of IRB - Committee For the Protection of Human Subjects
    The recognition of the need for guidelines dealing with human subjects in research emerged following the Nuremberg trials.
  134. [134]
    Institutional Review Boards: Purpose and Challenges - PMC
    IRBs have an important role in protecting human research participants from possible harm and exploitation. Independent review by an IRB or equivalent is an ...
  135. [135]
    The IACUC | OLAW - NIH
    Oct 30, 2024 · The IACUC is responsible for oversight of the animal care and use program and its components as described in the Public Health Service (PHS) Policy.
  136. [136]
    [PDF] Institutional Animal Care and Use Committee Guidebook
    role of the Institutional Animal Care and Use Committee (IACUC) in ensuring the ethical and sensitive care and use of animals in research, teaching and ...
  137. [137]
    9 CFR 2.31 -- Institutional Animal Care and Use Committee (IACUC).
    (a) The Chief Executive Officer of the research facility shall appoint an Institutional Animal Care and Use Committee (IACUC), qualified through the experience ...
  138. [138]
    About ORI - The Office of Research Integrity
    The Office of Research Integrity (ORI) oversees and directs Public Health Service (PHS) research integrity activities on behalf of the Secretary of Health ...
  139. [139]
    ORI - The Office of Research Integrity | ORI - The Office of Research ...
    ORI released three notice of funding opportunity announcements for projects, conferences, and innovative tools that advance the evolving field of research ...
  140. [140]
    [PDF] THE OFFICE OF RESEARCH INTEGRITY - Hofstra Law
    I. INTRODUCTION. The Office of Research Integrity (“ORI”) is a statutory office created by Congress in 1993, with authority to respond to ...
  141. [141]
    The Abject Failure of IRBs - The Chronicle of Higher Education
    Mar 23, 2022 · The Abject Failure of IRBs: Ethics-review systems have become exercises in absurdity and unpredictability.
  142. [142]
    Information about IRBs and Oral History
    Federal Institutional Review Board (IRB) oversight policies for the protection of human subjects in research have been an issue of great concern to oral ...
  143. [143]
    IRBs as compliance bureaucracy? A review of Regulating Human ...
    Jun 18, 2021 · This overreach, as Babb characterizes it, led to things like requirements for signed informed consent and other obligations that ...
  144. [144]
    Oversight of Dual-Use Research: What Role for Ethics Committees?
    The purpose of this study was to examine the role of national ethics committees in the context of governance and oversight of dual-use research at the national ...
  145. [145]
    Estimating the reproducibility of psychological science
    Aug 28, 2015 · We conducted a large-scale, collaborative effort to obtain an initial estimate of the reproducibility of psychological science.
  146. [146]
    A New Replication Crisis: Research that is Less Likely to be True is ...
    May 21, 2021 · Papers that cannot be replicated are cited 153 times more because their findings are interesting, according to a new UC San Diego study.
  147. [147]
    Are replication rates the same across academic fields? Community ...
    Jul 22, 2020 · We show that participants expect replication rates to increase over time. Moreover, they expect replication rates to differ between fields.
  148. [148]
    How big of an issue is the replication crisis in physics? - Reddit
    Aug 17, 2024 · The "replication crisis" is the discovery that in a lot of fields, vast quantities of research can't be replicated at all. This is blamed on the ...
  149. [149]
    The Replication Crisis: the Six P's | Alex Holcombe's blog
    Jul 23, 2023 · Professor Dorothy Bishop came up with “the four horsemen of irreproducibility“: publication bias, low statistical power, p-hacking, and HARKing.
  150. [150]
    The curious case of the reproducibility crisis - Polytechnique Insights
    Specifically, two of the most common “bad research” practices responsible for non-replicable results are due to statistical manipulation: p‑hacking and HARKing.
  151. [151]
    A Simple Explanation for the Replication Crisis in Science
    Aug 23, 2016 · The reasons proposed for this crisis are wide ranging, but typical center on the preference for “novel” findings in science and the pressure on investigators.
  152. [152]
    'An Existential Crisis' for Science - Institute for Policy Research
    Feb 28, 2024 · The replication crisis is when scientists can't get the same results as previous studies, threatening the reliability of scientific results.
  153. [153]
    An Overview of Scientific Reproducibility: Consideration of Relevant ...
    Vox's Resnick (2018) has argued that the replication crisis contributes to the distribution of misinformation to budding behavior science students due to ...
  154. [154]
    From the “Replicability Crisis” to Open Science Practices
    One response to this “replicability crisis” has been the emergence of open science practices, which increase the transparency and openness of the research ...
  155. [155]
    How Psychological Study Results Have Changed Since the ...
    May 21, 2025 · Since the replication crisis, psychological studies show stronger results, larger sample sizes, and fewer results narrowly meeting significance ...
  156. [156]
    We Should Do More Direct Replications in Science
    Jul 31, 2024 · I would contend that direct replication of experiments in psychology, medicine, biology, economics, and many other fields, is highly useful ...
  157. [157]
    Study Bias - StatPearls - NCBI Bookshelf
    In academic research, bias refers to a type of systematic error that can distort measurements and/or affect investigations and their results.
  158. [158]
    Types of Bias in Research | Definition & Examples - Scribbr
    Observer bias arises from the opinions and expectations of the observer, influencing data collection and recording, while actor–observer bias has to do with how ...
  159. [159]
    Humans actively sample evidence to support prior beliefs - PMC
    Confirmation bias is defined as the tendency of agents to seek out or overweight evidence that aligns with their beliefs while avoiding or underweighting ...
  160. [160]
    10 Types of Study Bias - Science | HowStuffWorks
    Mar 7, 2024 · Confirmation Bias · Sampling Bias · Selection Bias · Channeling Bias · Question-Order bias · Interviewer Bias · Recall Bias · Acquiescence Bias ...
  161. [161]
    Publication Bias in Meta-Analyses from Psychology and Medicine
    Publication bias is a substantial problem for the credibility of research in general and of meta-analyses in particular, as it yields overestimated effects ...
  162. [162]
    Many Results in Psychology and Medicine Are False Positives
    May 27, 2024 · Only 27.3% of the results in psychology and only 5.3% of the results in medicine have “strong evidence” for being true.
  163. [163]
    Footprint of publication selection bias on meta‐analyses in medicine ...
    Feb 7, 2024 · Publication selection bias, where studies with significant or positive results are more likely to be reported and published, distorts the ...
  164. [164]
    Industry Sponsorship bias | Catalog of Bias
    Industry sponsorship bias is a tendency for studies to support the funding organization's interests, skewing results to promote commercial interests.
  165. [165]
    The Influence of Industry Sponsorship on the Research Agenda - NIH
    Data from several fields have shown biases in the design, conduct, and publication of research that are related to industry funding sources. For example, ...
  166. [166]
    Types of Bias - Information Literacy In Real Life (IL IRL)
    Sep 10, 2025 · A real-life example of funding bias is a study published in a scientific journal that found drinks including high-fructose corn syrup did ...
  167. [167]
    Harvard Faculty Survey Reveals Striking Ideological Bias, But More ...
    Jul 27, 2022 · Only 16 percent of Harvard faculty members classify their political views as “moderate," and just one percent indicate that their views are ...
  168. [168]
    Political Biases in Academia | Psychology Today
    May 29, 2020 · A list of mostly peer-reviewed articles and academic books and chapters addressing the problem of political bias in academia.
  169. [169]
    The Hyperpoliticization of Higher Ed: Trends in Faculty Political ...
    Higher education has recently made a hard left turn—sixty percent of faculty now identify as “liberal” or “far left.” This left-leaning supermajority is ...
  170. [170]
    Bias in Research | Types, Identifying & Avoiding - ATLAS.ti
    Response bias: This happens when participants in a study respond inaccurately or falsely, often due to misleading or poorly worded questions. Observer bias (or ...
  171. [171]
    Publication bias | Catalog of Bias - The Catalogue of Bias
    Dickersin & Min define publication bias as the failure to publish the results of a study “on the basis of the direction or strength of the study findings.” This ...
  172. [172]
    CMV: academia isn't biased towards left-wing politics, facts are
    Jul 27, 2021 · Academia is biased towards the left in the sense that academics, especially in relevant fields, are more likely to hold left-wing views.
  173. [173]
    Social Justice and Indoctrination: Views of Faculty Accused of Bias
    Oct 15, 2024 · This study explores the discourse surrounding ideological bias in American higher education, particularly focusing on perceptions of bias ...
  174. [174]
    The Peer Review Process: Past, Present, and Future
    Jun 16, 2024 · Studies in which intentional flaws were introduced into papers found that most errors are missed. In a striking 1998 study, Godlee, Gale and ...
  175. [175]
    Peer review: a flawed process at the heart of science and journals
    So peer review is a flawed process, full of easily identified defects with little evidence that it works. Nevertheless, it is likely to remain central to ...
  176. [176]
    Publication bias - Importance of studies with negative results! - NIH
    Publication bias is defined as the failure to publish the results of a study on the basis of the direction or strength of the study findings.
  177. [177]
    The effect of publication bias magnitude and direction on the ... - NIH
    Publication bias occurs when studies with statistically significant results have increased likelihood of being published. Publication bias is commonly ...
  178. [178]
    Impact of institutional affiliation bias in the peer review process
    Mar 11, 2025 · Issues such as institutional affiliations, editor-author friendship, paradigm confirmation or theory support and networks have been raised ...
  179. [179]
    Ideological biases in research evaluations? The case of research on ...
    May 23, 2022 · Our interpretation is that researchers use information that is irrelevant to evaluate the quality and importance of a study's research design.
  180. [180]
    Ideological Gatekeeping and the Future of Peer Review
    Sep 30, 2020 · In short, it's nothing new that editors have ways to manipulate the peer review process when they desire. Sometimes, bias manifests ...
  181. [181]
    Biomedical paper retractions have quadrupled in 20 years — why?
    May 31, 2024 · The retraction rate for European biomedical-science papers increased fourfold between 2000 and 2021, a study of thousands of retractions has found.
  182. [182]
    Linking citation and retraction data reveals the demographics of ...
    The number of retracted papers per year is increasing, with more than 10,000 papers retracted in 2023 [6].
  183. [183]
    The peer-review crisis: how to fix an overloaded system - Nature
    Aug 6, 2025 · As pressure on the system grows, many researchers point to low-quality or error-strewn research appearing in journals as an indictment of their ...
  184. [184]
    The limitations to our understanding of peer review
    Apr 30, 2020 · We are often unable to discern whether peer reviews are more about form or matter, whether they have scrutinised enough to detect errors, ...
  185. [185]
    The 'publish or perish' mentality is fuelling research paper retractions
    Oct 3, 2024 · Published research papers can be retracted if there is an issue with their accuracy or integrity. And in recent years, the number of retractions has been ...
  186. [186]
    Scientific Research: The Problem with "Publish or Perish"
    Sep 17, 2024 · The “publish or perish” mindset incentivizes publishing quantity over quality, impacting the quality of research.
  187. [187]
    Perverse Incentives in Science: 21st Century Funding for 20th C ...
    Jun 21, 2017 · Perverse incentives include funding for established research, hypercompetitiveness in grant systems, and scientists spending more time on grant ...
  188. [188]
    [PDF] Stagnation and Scientific Incentives
    Scientific stagnation is caused by changes in scientist incentives, shifting focus from new ideas to incremental science due to emphasis on citations.
  189. [189]
    (PDF) Nonneutralities in Science Funding: Direction, Destabilization ...
    Aug 6, 2025 · It is argued that, while directional effects of funding are ubiquitous, destabilizing and distorting effects are much more likely to emerge ...
  190. [190]
    Science Has a Major Fraud Problem. Here's Why Government ...
    Jan 9, 2024 · 34 percent of scientists receiving federal funding have acknowledged engaging in research misconduct to align research with their funder's ...
  191. [191]
    Incentives and the replication crisis in social sciences: A critical ...
    The replication crisis arises from fundamental flaws in the incentives shaping publication and academic career progression, particularly the strong preference ...
  192. [192]
    Go Forth and Replicate: On Creating Incentives for Repeat Studies
    Sep 11, 2017 · Scientists have few direct incentives to replicate other researchers' work, including precious little funding to do replications.
  193. [193]
    Concerns About Replicability Across Two Crises in Social Psychology
    A Lack of Incentives for Replication Studies. During both crises researchers point out that the incentive structure does not reward replication studies.
  194. [194]
    How Competition for Funding Impacts Scientific Practice - NIH
    Feb 13, 2024 · Researchers across all groups experienced that competition for funding shapes science, with many unintended negative consequences.
  195. [195]
    How Long Does it Take to Get a Ph.D. Degree and Should You Get ...
    Apr 29, 2025 · The Education Data Initiative says it can range from five to 11 years, depending on various factors such as the type of doctorate, the program's ...
  196. [196]
    Time to Degree, Funding, and Completion Rates - NCBI - NIH
    Median time to degree in the biomedical sciences is relatively constant across fields: medians range from 4.88 to 5.73 years for all biomedical science fields.
  197. [197]
    Exploring PhD Completion Rates: How Many Finish on Time?
    Jun 20, 2024 · The data indicated that 57% of the doctoral candidates in the sample completed their degree programs within ten years, and 20% completed them after seven years.
  198. [198]
  199. [199]
    Postdoc life and after - how does it continue? : r/AskAcademia - Reddit
    Feb 3, 2023 · Max 2 years if in the US; 5–6 years as an assistant professor, then associate; on the safe side, 20 years minimum out of PhD, colloquially called ...
  200. [200]
    Skills to learn during the postdoc years - Eva Lantsoght
    Nov 23, 2023 · Postdocs should learn grant writing, lab management, exploring career paths, navigating the job market, working on tighter deadlines, and ...
  201. [201]
    Career progression paths for a Postdoc - Academia Stack Exchange
    Mar 10, 2016 · Postdoc paths include professor roles, pure research (Research Associate, Scientist, Senior Scientist), or tenure track. Some may need multiple ...
  202. [202]
    Too Many PhD Graduates or Too Few Academic Job Openings - NIH
    Nowadays, less than 17% of new PhDs in science, engineering and health-related fields find tenure-track positions within 3 years after graduation (National ...
  203. [203]
    In a first, U.S. private sector employs nearly as many Ph.D.s as ...
    In 2017, only 23% of these Ph.D.s held a tenured or tenure-track position in academia, a drop of 10 percentage points since 1997. Only math and the computer ...
  204. [204]
    What ratio of PhD graduates in CS fields ultimately get tenured ...
    Sep 16, 2024 · In Computer Science, the ratio of PhD graduates becoming tenured professors is about 11.73%, based on a 2022-2023 survey.
  205. [205]
    Competition for engineering tenure-track faculty positions in the ...
    May 7, 2024 · The average likelihood for securing a tenure-track faculty position for engineering overall during this 16-year period was 12.4% (range = 10.9–18.5%).
  206. [206]
    PhD Jobs: Top Non-Academic Careers for PhD Degree Holders
    21 top non-academic PhD jobs · 1. Publisher · 2. Curriculum leader · 3. Policy analyst · 4. Medical science liaison · 5. Environmental analyst · 6. Equity research ...
  207. [207]
    Exploring 7 Alternative Career Paths for PhDs Beyond Academia
    One of the most popular alternative career paths for PhDs is research and development in the private sector. Corporations and startups across various industries ...
  208. [208]
    How The Academic PhD Job Market Was Destroyed - Cheeky Scientist
    Oversupply Of PhDs & Shrinking Academic Positions. A concerning trend has been the mismatch between the number of PhD graduates and the availability of academic ...
  209. [209]
    Estimation of probabilities to get tenure track in academia
    Sep 20, 2020 · Between 10% and 30% of PhD alumni get a permanent academic position, with a 15-30% baseline chance if wanting to work in academia.Key takeaways · Introduction · Baseline
  210. [210]
    Ph.D. Oversupply: The System Is the Problem - Inside Higher Ed
    Jun 21, 2021 · The jobs crisis is built into institutional structures, and to push past the logjam, universities must improve communication, information and incentives.
  211. [211]
    Science PhDs face a challenging and uncertain future - Ars Technica
    Jun 4, 2025 · At the same time, some analysts have worried about an oversupply of PhDs in some fields, while students have suggested that universities are ...
  212. [212]
    The Origin of the Research University - Asterisk Magazine
    and for almost all of that time, they weren't centers of research. What changed in 19th century ...
  213. [213]
    [PDF] Founding America's First Research University
    Dec 21, 2018 · In 1872, Daniel Gilman, president of the University of California, Berkeley, articulated his vision of what a university should be.
  214. [214]
    [PDF] The Rise of the Research University: A Sourcebook
    By 1890, public universities began to remake themselves more explicitly in the research university model. They expanded their graduate programs and added ...
  215. [215]
    The Role Of Research At Universities: Why It Matters - Forbes
    Mar 2, 2022 · Universities engage in research as part of their missions around learning and discovery. This, in turn, contributes directly and indirectly to their primary ...
  216. [216]
    4 The Roles of Universities | Trends in the Innovation Ecosystem
    Universities are important sources of many of the new ideas in science and technology that contribute to innovation in the United States.
  217. [217]
    Impact of Achieved Tenure and Promotion on Faculty Research ...
    Critics of the promotion and tenure system contend that promotion and tenure may lead to a decline in research productivity (“dead wood phenomena”) by those ...
  218. [218]
    Out of Balance | AAC&U
    Almost half of Americans view higher education as friendlier to liberals than conservatives when it comes to free speech, according to a 2023 survey by the ...
  219. [219]
    7 steps to publishing in a scientific journal - Elsevier
    These guidelines focus on preparing a full article (including a literature review), whether based on qualitative or quantitative methodology.
  220. [220]
    Editorial process - How to publish a scientific paper
    Aug 19, 2025 · 1. Submission: The manuscript is submitted by the corresponding author, and receives a submission or tracking ID number. 2. Preliminary editorial screening.
  221. [221]
    Publication Output by Region, Country, or Economy and by Scientific ...
    Dec 11, 2023 · In 2022, total worldwide S&E publication output was 3.3 million articles. China had 898,949, and the US had 457,335. Health sciences had the ...
  222. [222]
    Preprints: What Role Do These Have in Communicating Scientific ...
    Preprints are freely accessible documents, not peer-reviewed, to make new knowledge available before traditional validation, and can be an interim step.
  223. [223]
    The evolving role of preprints in the dissemination of COVID-19 ...
    By communicating science through preprints, we are sharing research at a faster rate and with greater transparency than allowed by the current journal ...
  224. [224]
    The Role of Preprints in Journal Publishing: New developments in ...
    Apr 7, 2021 · Many journals are beginning to allow and even encourage preprint posting, and some are pioneering new preprint review and publishing models.
  225. [225]
    Uptake of Open Access (OA) - STM Association
    Between 2014 and 2024, the percentage share of global articles, reviews and conference papers made available via gold has increased by 26 percentage points, from 14% to 40%.
  226. [226]
    Open-access revolution is squeezing scientific societies' budgets ...
    Jun 9, 2025 · The trend toward open-access publishing is threatening that income stream even as costs are rising, a recent survey indicates.
  227. [227]
    Open Access Journal Publishing 2024-2028 - The Freedonia Group
    Open access journal revenue is projected to increase at a compound annual rate of 11.5% to reach $3.25 billion by 2028, according to a new Simba report.
  228. [228]
    Conference presentations: A research primer for low- and middle ...
    Presenting research at a conference is an opportunity to disseminate the findings, network with other researchers, and to develop your academic track record.
  229. [229]
    Disseminating Knowledge and Research Findings at Conferences
    The session aimed to help biomedical and health informatics professionals stay current with the most “relevant, interesting, or innovative” papers of the year.
  230. [230]
    Methods to Disseminate Research: A Primer - PMC - PubMed Central
    Jan 1, 2025 · This paper provides an introduction to disseminating/communicating research findings to other research professionals, clinicians, policymakers, and funders/ ...
  231. [231]
    End of Year Edition – Against All Odds, Global R&D Has Grown ...
    Dec 18, 2024 · End of Year Edition – Against All Odds, Global R&D Has Grown Close to USD 3 Trillion in 2023. Global R&D spending has nearly tripled since 2000 ...
  232. [232]
    U.S. and Global Research and Development
    Jan 18, 2022 · In 2019, the United States (27% or $656 billion) and China (22% or $526 billion) performed about half of the global R&D.
  233. [233]
    Research on R&D Funding: The Different Functions of Public and ...
    Sep 8, 2025 · 69.6% of GERD is already financed by private industry in the U.S., with the remainder a mixture of higher education, nonprofits, and foreign ...
  234. [234]
    Federal Research and Development (R&D) Funding: FY2025
    Dec 9, 2024 · President Biden's budget proposal for FY2025 includes approximately $201.9 billion for R&D, $7.4 billion (4%) above the FY2024 estimated level of $194.6 ...
  235. [235]
    These are the biggest spenders on private-sector research - Nature
    Aug 1, 2025 · According to the data, the European Commission awarded at least twice as many individual grants (2,490) as the other top funders. NEDO, by ...<|separator|>
  236. [236]
    Fact of the Week: China and the EU Invest More in Research at ...
    Jun 23, 2025 · In 2023, the United States invested about $175 billion in research conducted at government institutions and universities.
  237. [237]
    A comparative analysis of public R&I funding in the EU, US, and China
    Jun 6, 2025 · This paper compares public R&I funding across the EU, US, and China, the world's largest R&I spenders, over recent years.
  238. [238]
    Innovation Lightbulb: Breaking Down Private Sector Research and ...
    May 22, 2024 · Funding for private sector-performed R&D totaled $602 billion in 2021 across all sectors, accounting for 75 percent of the US total ($806 billion) and with the ...
  239. [239]
    Ranked: The Countries Investing the Most in R&D - Visual Capitalist
    Apr 17, 2025 · Since 2020, OECD countries have spent an average of 2.7% of their GDP on R&D, altogether spending $1.9 trillion in 2023.
  240. [240]
    How universities spend billions in government funds - USAFacts
    May 6, 2025 · In FY 2023, federal dollars supported $59.6 billion of university R&D expenses. The fields that received the most funding in 2023 were life ...
  241. [241]
  242. [242]
    Research Grants | EREF
    Previously awarded grants have ranged from $15,000 to over $500,000 with the average grant amount in recent years being $160,000. Typical project durations are ...
  243. [243]
    [PDF] Private Foundations that Fund Academic Research: A Quick Guide
    While much of the funding is focused on interventions, they do fund basic research (especially related to disease prevention). Burroughs Wellcome Fund. Supports ...
  244. [244]
    Global R&D and International Comparisons
    Jul 23, 2025 · The higher education sector had double-digit performing shares only in China, Germany, and France among the largest R&D performers, with ...
  245. [245]
    Research and development expenditure (% of GDP) | Data
    Research and development expenditure (% of GDP) · Primary government expenditures as a proportion of original approved budget (%) · Expense (% of GDP) · Tax ...
  246. [246]
    Gross domestic spending on R&D - OECD
    Gross domestic spending on research and development (R&D) is the total expenditure (current and capital) on R&D in a country.
  247. [247]
    Publication Output by Geography and Scientific Field
    Jul 23, 2025 · In 2023, four countries each produced more than 100,000 articles: China, the United States, India, and Germany. Together, they accounted for ...
  248. [248]
    2024 Research Leaders: Leading countries/territories | Nature Index
    The top 3 research leaders in 2024 are China, USA, and Germany, based on Nature Index data from 2023.
  249. [249]
    The United States Continues to Lead High-Impact Scientific ... - WIPO
    Jul 31, 2025 · In 2024, the United States remains firmly in first place in terms of the quality of scientific publications. What's more, this US leadership is ...
  250. [250]
    Researchers In R&D (per Million People) By Country
    This page lists countries by researchers in R&D per million people. For example, the UAE has 2607, Austria has 6659, and the world has 1516.
  251. [251]
    R&D expenditure - Statistics Explained - Eurostat
    Sep 25, 2025 · In 2023, EU research and development expenditure relative to GDP stood at 2.26%, higher than in the previous year when it recorded 2.22%.
  252. [252]
    SJR - International Science Ranking - Scimago
    Country, Documents · Citable documents · Citations · Self-Citations · Citations per Document · H index. 1, US, United States. 16963549.
  253. [253]
    U.S. R&D Totaled $892 Billion in 2022; Estimate for 2023 Indicates ...
    Feb 27, 2025 · Of the $892 billion total, the business sector funded $673 billion and the federal government funded $164 billion. Downloads.
  254. [254]
    Public and Private R&D Are Complements—Not Substitutes - CSIS
    Aug 20, 2025 · Over the past several decades, private sector R&D investment has grown rapidly, far outpacing federal spending in dollar terms. This is good ...
  255. [255]
    Public- And Private-Sector Contributions to the Research ... - PubMed
    The private sector was also dominant in achieving the major milestones for both the production and drug development phases (81% and 73% of the drugs reviewed, ...
  256. [256]
    A Closer Look at US Private Sector R&D Spending in a Global Context
    Feb 9, 2024 · In 2022, U.S. firms' R&D spending ranked sixth at 2.57 percent of GDP, behind firms in Switzerland (4.75 percent), Taiwan (3.51 percent), Japan ...
  257. [257]
    Global Innovation Index 2024: Analyzing global R&D trends with the ...
    Sep 26, 2024 · Global R&D spending: In 2023, the global corporate R&D expenditure real growth of 6.1% was slower than the 7.5% real growth rate in 2022.
  258. [258]
    Highly accurate protein structure prediction with AlphaFold - Nature
    Jul 15, 2021 · AlphaFold greatly improves the accuracy of structure prediction by incorporating novel neural network architectures and training procedures ...
  259. [259]
    The impact of AlphaFold Protein Structure Database on ... - PubMed
    In 2021, DeepMind and EMBL-EBI developed the AlphaFold Protein Structure Database to make an unprecedented number of reliable protein structure predictions ...
  260. [260]
    Great expectations – the potential impacts of AlphaFold DB | EMBL
    Jul 22, 2021 · The protein-structure predictions in AlphaFold DB will have an immediate impact on molecular structural biology research, and in a longer ...
  261. [261]
    Before and after AlphaFold2: An overview of protein structure ...
    Feb 27, 2023 · In this mini-review, we provide an overview of the breakthroughs in protein structure prediction before and after AlphaFold2 emergence.
  262. [262]
    Elicit: AI for scientific research
    Use AI to search, summarize, extract data from, and chat with over 125 million papers. Used by over 2 million researchers in academia and industry.
  263. [263]
    AI-Based Literature Review Tools - Research Guides - LibGuides
    Oct 13, 2025 · Selected AI-Based Literature Review Tools · AI-POWERED RESEARCH ASSISTANT - finding papers, filtering study types, automating research flow, ...
  264. [264]
    Best AI Research Tools for Academics and Researchers - Litmaps
    AI research tools are software applications that assist with various research stages, such as literature reviews, data analysis, writing, and collaboration.
  265. [265]
    Generative artificial intelligence in public health research and ...
    Aug 18, 2025 · This narrative review synthesised 18 recent peer-reviewed and grey literature (2023–2025) to explore the role of GenAI in public health research ...
  266. [266]
    AI-enabled scientific revolution in the age of generative AI - Nature
    Aug 11, 2025 · Recent advances in generative AI allow for the creation of more expressive and adaptive simulators that can better capture system complexity and ...
  267. [267]
    The Latest AI News and AI Breakthroughs that Matter Most: 2025
    Summary: MIT researchers have developed a generative AI system named FlowER (Flow matching for Electron Redistribution) that predicts chemical reactions while ...
  268. [268]
    What is reproducibility in artificial intelligence and machine learning ...
    Apr 18, 2025 · Reproducibility challenges include incomplete documentation of training data, limited access to computational infrastructure, and difficulties ...
  269. [269]
    Artificial intelligence: help or hindrance in solving the reproducibility ...
    Jun 20, 2024 · Many trials in the AI space have limitations in their study design, a high risk of bias, and limited transparency in the availability of data or ...
  270. [270]
    Bias in medical AI: Implications for clinical decision-making - NIH
    Nov 7, 2024 · Left unaddressed, biased medical AI can lead to substandard clinical decisions and the perpetuation and exacerbation of longstanding healthcare ...
  271. [271]
    Bias in artificial intelligence for medical imaging
    Bias in medical AI is systematic error, internalized by AI, causing a distance between prediction and truth, potentially harming patients.
  272. [272]
    Artificial intelligence in scientific research: Challenges, opportunities ...
    AI offers opportunities like accelerated data analysis, but also challenges such as bias and reproducibility issues. Human expertise is essential for ...
  273. [273]
    AI pitfalls and what not to do: mitigating bias in AI - PMC - NIH
    Reproducibility of results is another concern that must be addressed. Failure to provide transparent documentation and detailed methodologies especially for ...
  274. [274]
    FAIR Principles
    The principles refer to three types of entities: data (or any digital object), metadata (information about that digital object), and infrastructure.
  275. [275]
    The EUA Open Science Agenda 2025
    The EUA Open Science Agenda 2025 defines the Association's priorities in this field and describes the current context, challenges and developments envisaged ...
  276. [276]
    With 44% of its published articles now open access (OA), Springer ...
    Aug 8, 2024 · 44% of the publisher's primary research is now published OA in its hybrid and fully OA journals, up from 38% in 2022.
  277. [277]
    In bid to expand, bioRxiv and medRxiv preprint servers move to ...
    Mar 11, 2025 · ... preprints on medRxiv dropped from 80% to just 7%. But growth in preprints about other topics continued: The 12,863 posted on medRxiv in 2024 ...
  278. [278]
    Accelerating scientific progress with preprints - Nature
    May 29, 2024 · Many universities have also encouraged listing publications on preprint servers as part of applications for faculty hiring and tenure/promotion.
  279. [279]
    The FAIR Guiding Principles for scientific data management ... - Nature
    Mar 15, 2016 · The FAIR Principles put specific emphasis on enhancing the ability of machines to automatically find and use the data, in addition to supporting its reuse by ...
  280. [280]
    FAIR Data Principles at NIH and NIAID
    Apr 18, 2025 · The FAIR data principles are a set of guidelines aimed at improving the Findability, Accessibility, Interoperability, and Reusability of digital assets.
  281. [281]
    Global Impact - Open Data's Impact
    Open data has improved governments, empowered citizens, contributed solutions to complex public problems, and created new economic opportunities.
  282. [282]
    AI content is tainting preprints: how moderators are fighting back
    Aug 12, 2025 · Preprint servers are seeing a rise in submissions seemingly produced by paper mills or with help from AI tools.
  283. [283]
    Impacts of COVID-19 pandemic on the global energy system and the ...
    In this review, opportunities, challenges, and significant impacts of the COVID-19 pandemic on current and future sustainable energy strategies were analyzed ...
  284. [284]
    Antimicrobial resistance: Impacts, challenges, and future prospects
    Developing new antimicrobial drugs and alternative therapies is a crucial aspect in combating antibiotic resistance, alongside surveillance and diagnostic ...
  285. [285]
    Antimicrobial resistance: a concise update - The Lancet Microbe
    Sep 18, 2024 · In this rapidly advancing field, this Review provides a concise update on AMR, encompassing epidemiology, evolution, underlying mechanisms ( ...
  286. [286]
    New Study Examines Drivers of Government Investment in Energy ...
    Sep 12, 2022 · They found that energy funding among seven of the eight major economies grew from $10.9 billion to $20.1 billion between 2001 and 2018, an 84- ...
  287. [287]
    Energy Security, Climate Change, and Routines as Maladaptive ...
    Jul 29, 2024 · Abstract. Energy transitions suffer from a central political challenge. Future costs of the energy transition are directly linked to ...
  288. [288]
    Climate, conflict and energy security – our research shows how the ...
    Jun 27, 2025 · This resurgence comes amid a polycrisis marked by climate breakdown, social inequality, energy insecurity and geopolitical instability.
  289. [289]
    An Overview of the Recent Advances in Antimicrobial Resistance
    Strengthening surveillance and monitoring systems to track the emergence and dissemination of resistant pathogens.
  290. [290]
    The Global Challenge of Antimicrobial Resistance: Mechanisms ...
    Jul 23, 2025 · This review analyzes the molecular and ecological mechanisms underlying antibiotic resistance and evaluates global efforts aimed at containment ...
  291. [291]
    Recent advances in environmental antibiotic resistance genes ...
    Antibiotic resistance genes (ARGs) persistence and potential harm have become more widely recognized in the environment due to its fast-paced research.
  292. [292]
    U.S. Funding Is Insufficient to Address the Human Health Impacts of ...
    Despite these risks, extramural federal funding of climate change and health research is estimated to be < $3 million per year. Conclusions. Given the real ...
  293. [293]
    Cutting pollution could slow the spread of antimicrobial resistance
    Mar 12, 2023 · Climate change, biodiversity and antimicrobial resistance are also closely linked, with drugs further damaging ecosystems. Image: Unsplash/ ...