
Research Excellence Framework

The Research Excellence Framework (REF) is the United Kingdom's periodic system for evaluating the quality and broader effects of research conducted in higher education institutions (HEIs). It assesses research through expert peer review across 34 disciplinary units of assessment, focusing on three core elements: scholarly outputs such as publications and datasets; societal, economic, and cultural impacts arising from research; and the vitality and sustainability of the research environment, including staff resources and infrastructure. Outcomes directly inform the allocation of approximately £2 billion in annual public funding for universities' research capabilities by the four UK funding bodies—Research England, the Scottish Funding Council, the Higher Education Funding Council for Wales, and the Department for the Economy in Northern Ireland. Introduced in 2014 to replace the Research Assessment Exercise (RAE), which had evaluated research since 1986 primarily on publication quality, the REF expanded the criteria to include impact and environment, aiming to better capture research's real-world contributions and institutional support structures. The 2014 and 2021 exercises rated submissions on a scale from unclassified to world-leading (4*), with REF 2021 results showing 41% of activity as world-leading and 43% as internationally excellent, underscoring sustained research strengths across disciplines and regions. These assessments have incentivized investment in research infrastructure and outreach, contributing to the UK's global research prominence, as evidenced by consistently high rankings in international bibliometric indicators. Despite these outcomes, the framework has drawn substantial criticism for its administrative burdens, with preparation and submission costs estimated in the tens of millions per institution and total system-wide expenses exceeding £200 million per cycle, diverting resources from actual research. Institutions have exploited the rules through practices such as selective staff and output submissions, potentially inflating average scores while excluding lower-rated work, which undermines the framework's goal of comprehensive assessment. The emphasis on demonstrable impact has also prompted concerns over subjective judgments in panel evaluations and incentives for short-term, quantifiable outcomes over fundamental or long-horizon inquiry, though empirical reviews indicate mixed evidence on whether it systematically distorts research priorities.

Origins and Evolution

Pre-REF Assessments: The Research Assessment Exercise

The Research Assessment Exercise (RAE) served as the UK's principal system for periodically evaluating the quality of research conducted in higher education institutions, operating from 1986 to 2008 before being succeeded by the Research Excellence Framework. Conducted approximately every five years on behalf of the higher education funding councils, the RAE employed peer review to rank research outputs submitted by departments across various subject areas, known as units of assessment (UoAs). Its core purpose was to inform the allocation of quality-related (QR) research funding, shifting from an earlier equity-based model to one that concentrated resources in high-performing areas, thereby aiming to enhance overall research excellence and international competitiveness. Originating amid concerns over declining UK research citation impact, the first exercise in 1986—termed the Research Selectivity Exercise (RSE) and led by the University Grants Committee—introduced standardized assessment of departmental research outputs to justify public funding and redistribute approximately 40% of institutional grants toward stronger performers. This initial round laid the groundwork for selectivity, with a follow-up in 1989 expanding to 152 UoAs and adopting a five-point rating scale to refine funding decisions, influencing around 50% of grants. By the 1992 RAE, formally renamed as such after the Further and Higher Education Act 1992, the process assessed research quality across an expanded set of institutions, using a 1-5 rating scale where higher grades received disproportionately more funding—such as grade 5 departments getting four times the allocation of grade 2—tying over 90% of research budgets to performance outcomes. Subsequent rounds iteratively refined the methodology while maintaining peer review as the dominant approach. The 1996 RAE required submissions of up to four publications per researcher, introduced a 1-5* scale (with 5* denoting internationally excellent work), and covered 69 UoAs, excluding funding for the lowest-rated categories to sharpen incentives. In 2001, emphasis shifted toward evaluating the merit of individual outputs rather than departmental averages, incorporating provisions for special circumstances like interdisciplinary work, with ratings extended to include a rare 6* for world-leading research in some panels. The final 2008 iteration featured 67 UoAs, shifted to granular quality profiles assigning percentages of staff outputs to categories 1* through 4* (where 4* signified world-leading quality), and rewarded "pockets of excellence" within institutions to avoid penalizing uneven departmental strengths. Throughout its tenure, the RAE's peer-reviewed assessments—conducted by expert panels drawing on publications, research income, and staff details—prioritized output quality over quantity, enabling funding councils to benchmark institutions and allocate billions in annual QR funding selectively. This approach demonstrably boosted research productivity, as evidenced by rising mean ratings across rounds (particularly between the 1996 and 2001 exercises) and sustained high citation impacts, though it faced critiques for encouraging game-playing, such as selective staff submissions, and for burdening academics with preparation costs. By concentrating funding—QR grants totaled around £2 billion annually by the late 2000s—the RAE fostered excellence in elite departments but widened disparities between high- and low-rated units, informing the REF's design to incorporate broader elements like research environment and societal impact.

Inception and Design of the REF

The Research Excellence Framework (REF) emerged as a response to limitations identified in the preceding Research Assessment Exercises (RAEs), with its inception rooted in a government announcement in the December 2006 Pre-Budget Report to replace the RAE with a new system combining expert review and metrics for greater efficiency and broader assessment scope. Following the 2008 RAE, the Higher Education Funding Council for England (HEFCE), alongside counterpart bodies in Scotland, Wales, and Northern Ireland, led the development of the REF to inform the selective allocation of Quality-Related Research (QR) funding based on research excellence across higher education institutions. The framework was formally positioned as the successor to the RAE, aiming to evaluate not only research outputs but also their societal impacts and the sustainability of research environments. Development involved extensive consultations to refine the design, including an initial HEFCE consultation in November 2007 on the balance between metrics and expert review, which favored a hybrid approach over the purely metrics-based system initially proposed. A second consultation in September 2009 addressed core features such as the inclusion of impact assessment, leading to a pilot exercise on impact in 2010 that informed final guidelines. These processes ensured stakeholder input from academia and funding bodies, resulting in the publication of detailed assessment guidance by HEFCE in 2011, which outlined submission requirements for the inaugural REF covering the period from 2008 to 2013. Submissions opened in January 2013, with results published on December 18, 2014, and subsequent funding allocations in March 2015. The REF's design emphasized a unified framework applicable across all disciplines, structured around three elements with differentiated contributions to the overall quality profile: research outputs (65% weighting), impacts (20%), and research environment (15%). Outputs were evaluated by peer panels for originality, significance, and rigour, typically comprising up to four items per researcher from January 1, 2008, to December 31, 2013. Impacts were assessed via case studies demonstrating economic or societal benefits traceable to research conducted from January 1, 1993, onward, with submissions required to link impacts to specific units of assessment. The environment component drew on templates detailing strategy, staffing, and infrastructure from January 1, 2008, to July 31, 2013, to gauge vitality and sustainability. Peer review by expert sub-panels, organized into main panels by broad subject areas, formed the core methodology, supplemented by selective metrics for outputs where appropriate. In contrast to the RAE's focus primarily on peer-reviewed outputs, the REF introduced impact and environment as explicit criteria to capture broader contributions and institutional capacity, while incorporating citation data where appropriate to reduce burden and enhance objectivity—though expert judgment remained predominant following consultation against full metric reliance. This hybrid model addressed criticisms of the RAE's high costs and game-playing incentives, such as selective staff submissions, by emphasizing demonstrable real-world effects, although returns for all research-active staff were not mandated until REF 2021. The design prioritized equality, transparency, and consistency, with guidelines requiring institutions to address staff circumstances affecting submissions.
Overall, the REF aimed to incentivize high-quality, impactful research while informing funding decisions that rewarded excellence, with 82% of submitted research rated world-leading or internationally excellent in the 2014 exercise.

Implementation of REF 2014

Institutions submitted data to REF 2014 through an online system managed by the Higher Education Funding Council for England (HEFCE), with the census date for eligible staff set at 31 July 2013 and the submission deadline on 29 November 2013. A total of 154 higher education institutions (HEIs) participated, producing 1,911 submissions across 36 units of assessment (UOAs) grouped under four main panels: Panel A (medicine, health, and life sciences), Panel B (physical sciences and engineering), Panel C (social sciences), and Panel D (arts and humanities). Each submission included research outputs (typically four per selected researcher), impact case studies (with evidence of socioeconomic benefits from research conducted between 1993 and 2013), and details on the research environment, weighted at 65%, 20%, and 15% respectively in the overall evaluation. The assessment process relied on peer review by over 1,000 expert panel members, including academics and research users, appointed between 2010 and 2012 to ensure disciplinary expertise and independence. Panels applied standardized criteria to score outputs, impacts, and environments on a four-point starred scale (from 4* for world-leading quality down to 1*, plus unclassified), with calibration exercises in early 2014 to align scoring across sub-panels and main panels. Impact evaluation involved reviewing 6,975 case studies and institutional templates, allocated to 2-4 reviewers per item, followed by individual scoring, moderation meetings, and limited auditing of selected cases to verify evidence. Sub-panels handled initial reviews within clusters, escalating discrepancies for resolution, while main panels oversaw cross-panel consistency without direct rescoring. Results were finalized after panel deliberations throughout 2014 and published on 18 December 2014, providing quality profiles for each submission rather than overall institutional rankings. These outcomes informed research funding allocations by the four UK higher education funding bodies for the period 2015-16 to 2020-21, distributing approximately £2 billion annually based on the volume and quality of research activity, with 4*-rated activity attracting the highest funding intensity. The implementation marked the first inclusion of impact in the UK's periodic research evaluation, introducing requirements for demonstrable evidence of real-world benefits, though panels noted variability in evidence quality and the need for clearer pathways linking outputs to impacts.

REF 2021 and Key Modifications

The Research Excellence Framework (REF) 2021 represented the second periodic assessment of UK research under the framework, with submissions required by 31 March 2021 following delays from the original November 2020 deadline, and results published on 12 May 2022. In response to the Stern Review's recommendations following REF 2014, the framework shifted toward greater inclusivity by mandating submission of all academic staff with a "significant responsibility for research," eliminating the selective approach used previously and resulting in a 46% increase in submitted staff full-time equivalents to 76,132 across 157 institutions. This change aimed to reduce gaming of the system but raised administrative burdens, with institutions required to declare staff circumstances—such as career breaks or health issues—affecting 41% of submitted staff to adjust output expectations, up from 29% in REF 2014. Assessment weightings were adjusted to emphasize broader research contributions: outputs weighted at 60% (down from 65% in 2014), impact at 25% (up from 20%), and environment unchanged at 15%. Outputs were assessed on originality, significance, and rigour, with an average of 2.5 required per full-time equivalent researcher and between one and five per individual, and eligibility limited to outputs first made publicly available between 1 January 2014 and 31 December 2020; pre-2014 outputs were ineligible unless substantially revised. Impact evaluation expanded to include case studies demonstrating effects from research conducted between 1 January 2000 and 31 December 2020, with impacts realized from 1 August 2013 to 31 December 2020 (the end date extended due to pandemic disruptions), requiring evidence of rigorous pathways from research to outcomes across economic, societal, cultural, and other domains. The environment component transitioned to an institutional-level narrative (via the REF 5a template) supplemented by unit-specific data, focusing on strategy, resources, and support for research, people, and infrastructure, rather than solely unit-of-assessment templates as in 2014. An open access policy mandated that journal articles and conference proceedings published from 1 April 2016 to 31 July 2020 be deposited in institutional repositories within three months of acceptance or publication, with exceptions for embargoes or publisher restrictions, to promote open research while balancing compliance challenges. Interdisciplinary research received explicit recognition through tailored criteria and identifiers, with 6,340 cross-referrals across 34 units of assessment. COVID-19 prompted targeted revisions, including a four-month timeline extension, virtual or hybrid panel assessments from May 2021, and allowances for narrative statements on disruptions; physical outputs (13,176 items) were handled flexibly via USB distribution to 1,100 assessors. Overall, 185,594 outputs and 6,781 case studies were submitted, with total costs to the funding bodies of £16.7 million—16% above REF 2014 but under budget—reflecting process efficiencies despite heightened equality, diversity, and inclusion measures such as diverse panel recruitment from over 130 universities. These modifications prioritized fairness and a reduced environmental footprint, with hybrid meetings replacing the 250 in-person sessions held in 2014.

Developments Toward REF 2029

In May 2023, UK Research and Innovation (UKRI) published initial decisions for the next exercise, initially termed REF 2028 but later confirmed as REF 2029, following recommendations from the Future Research Assessment Programme (FRAP). These decisions emphasized reducing administrative burden, shifting assessment from individual researchers to institutional and disciplinary levels, and rebalancing evaluation elements to better reflect research culture and societal benefits. Central reforms include a new structure with three weighted assessment elements: Contributions to Knowledge and Understanding (CKU) at 50%, Engagement and Impact (E&I) at 25%, and People, Culture, and Environment (PCE) at 25%, subject to validation via a PCE pilot. CKU submissions decouple outputs from specific staff, requiring only institutional employment links and allowing unit-level reductions for special circumstances, while eliminating minimum output quotas per researcher. E&I broadens to include institutional statements and fewer mandatory case studies (a minimum of one for units below 10 full-time equivalent staff), removing the prior 2* quality threshold. PCE introduces institution-wide narratives on the research environment, informed by advisory panels on people and culture and on research diversity. Submissions occur across 34 units of assessment, using average staff volume from Higher Education Statistics Agency data, with no individual staff returns. Open access policies for journal articles and conference contributions, refined after a 2024 consultation, mandate post-publication deposits within three months starting January 1, 2026, permit certain licenses beyond CC-BY, and add exceptions while initially excluding longform outputs. Guidance documents advanced progressively, with the overview published on January 16, 2025, detailing submission processes and eligibility—such as participation requests by June 30, 2027, for institutions lacking research degree provision. Detailed guidance on CKU and related elements followed on June 12, 2025, stressing transparency, equity, and inclusion in staff identification, alongside support for diverse roles like technicians. Expert peer review panels were appointed in 2025 to handle assessments. However, on September 5, 2025, UKRI announced a pause in criteria-setting and final guidance publication, initiated by the Science Minister to align with government priorities; this short stock-taking period allows continued work on PCE piloting, potential lighter tracks for less research-intensive institutions, and funding modifications for collaboration, with sector engagement planned for the autumn. The exercise aims for results in December 2029 to inform the annual allocation of approximately £2 billion in public funding.

Assessment Framework

Units of Assessment and Institutional Eligibility

The Research Excellence Framework (REF) structures its assessments around discipline-specific units of assessment (UOAs), into which higher education institutions (HEIs) submit their research outputs, impact case studies, and environment data. These UOAs ensure evaluations are conducted by expert sub-panels with relevant disciplinary knowledge, covering the full spectrum of academic research. For REF 2021, there were 34 UOAs grouped under four main panels: Main Panel A (Medicine, Health and Life Sciences, UOAs 1–6), Main Panel B (Physical Sciences, Engineering and Mathematics, UOAs 7–12), Main Panel C (Social Sciences, UOAs 13–24), and Main Panel D (Arts and Humanities, UOAs 25–34). Institutions select UOAs based on the primary disciplinary focus of their staff's research, with submissions required to align with the defined scope of each unit to facilitate peer review by specialists. This structure, retained for REF 2029 with minor adjustments to sub-panel configurations, allows for granular assessment while aggregating results at the institutional level for funding purposes. Institutional eligibility for REF submissions is restricted to UK HEIs recognized by the four funding bodies—Research England, the Scottish Funding Council, the Higher Education Funding Council for Wales, and the Department for the Economy in Northern Ireland—as undertaking research activities eligible for quality-related (QR) funding. Only these bodies' recognized providers may participate, ensuring the framework supports the allocation of public funding to domestic institutions. In REF 2021, 157 such HEIs submitted across the UOAs, reflecting the population of qualifying universities and colleges with sufficient research volume. Eligibility does not extend to unrecognized entities or private providers without funding body recognition, maintaining the REF's focus on publicly funded research. Institutions must also comply with submission requirements, including a code of practice for staff selection and data provision, verified against funding council lists prior to assessment. For REF 2029, the criteria remain aligned with this model, emphasizing institutional research enabled within the HE sector.

Submission Processes and Data Requirements

Institutions submit data to the REF through an online submission system, following preparation guided by the official REF guidance documents. For REF 2021, submissions were originally due in November 2020 but extended to 31 March 2021 owing to the COVID-19 pandemic, with institutions importing structured data files covering staff, outputs, impacts, and environment for each unit of assessment (UoA). Prior to final submission, higher education institutions (HEIs) must develop and submit a Code of Practice (CoP) outlining internal procedures for staff selection, output portability, and data integrity, which undergoes review and approval by the four UK funding bodies to promote equity and auditability. Key data requirements for REF 2021 included Category A staff details—defined as individuals employed by the HEI on or after 1 August 2013 with a minimum of 0.2 full-time equivalent (FTE) research involvement and significant responsibility for research by 31 July 2020—linked to an average of 2.5 outputs per FTE, such as peer-reviewed articles, books, or other scholarly works meeting open access criteria where applicable. Impact case studies required evidence of societal or economic benefits arising from research conducted between 1 January 2000 and 31 December 2020, submitted via standardized templates (REF3) with details on the underpinning research, pathways to impact, and verifiable outcomes. Environment data returns (REF4) demanded quantitative data on income, doctoral completions, and staffing profiles, alongside narrative environment statements (REF5) of up to 10,000 words per UoA on strategy and support. All elements were subject to audit trails, with non-compliance risking exclusion. For REF 2029, submission processes shift toward reduced individual-level scrutiny, eliminating mandatory staff submissions and decoupling outputs from specific researchers to emphasize institutional contributions. Outputs are reclassified as Contributions to Knowledge and Understanding (CKU), selected via CoP-guided processes without minimum per-person quotas, allowing up to five CKU items per researcher nomination but prioritizing institutional over individual attribution; these must demonstrate significant advancement in knowledge within the UoA. Engagement and impact expands to include a broader range of activities with evidence from 1 January 2014 onward, while environment assessment focuses on people, culture, and infrastructure through templates capturing institutional strategies, with submissions anticipated in late 2027. Open access mandates evolve, requiring deposit within three months of publication for eligible outputs from 1 January 2026, while exempting monographs contracted before that date. These changes aim to lower administrative burdens, with an estimated 50% reduction in output volume compared to REF 2021, while maintaining peer-reviewed rigour.
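A minimal sketch of the REF 2021 output-volume rule described above may help make the arithmetic concrete. The staff list, FTE values, and rounding convention below are illustrative assumptions, not official REF logic:

```python
# Hypothetical illustration of the REF 2021 output-volume rule: a unit
# returns on average 2.5 outputs per submitted FTE, with each individual
# normally contributing between one and five outputs.

staff_fte = {"A": 1.0, "B": 1.0, "C": 0.6, "D": 0.2}  # Category A staff (invented)

total_fte = sum(staff_fte.values())        # 2.8 FTE
required_outputs = round(total_fte * 2.5)  # 7 outputs for the whole unit

# Per-person bounds, before any approved reductions for circumstances:
min_per_person, max_per_person = 1, 5
feasible = (
    len(staff_fte) * min_per_person
    <= required_outputs
    <= len(staff_fte) * max_per_person
)
print(round(total_fte, 2), required_outputs, feasible)  # 2.8 7 True
```

Under this rule a unit's output pool scales with its submitted FTE rather than with headcount, which is why fractional contracts near the 0.2 FTE threshold attracted scrutiny in earlier cycles.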

Peer Review Panels and Expertise

The Research Excellence Framework (REF) employs a system of expert peer review conducted by four main panels and 34 sub-panels, each aligned with subject-based units of assessment (UoAs). Main panels oversee the overall assessment process, ensuring consistent application of criteria across sub-panels, calibrating standards, and approving final results; they include chairs of sub-panels, interdisciplinary experts, international members for benchmarking, and specialists in research application. Sub-panels perform the core evaluation of institutional submissions, applying discipline-specific criteria to outputs, impacts, and environments through expert judgment informed by agreed protocols. Panel membership is recruited via nominations from subject associations, learned societies, and research organizations, with selections prioritizing individuals of established standing and deep expertise in relevant fields to reflect the strengths of the UK research community. For REF 2021, nominations closed on December 20, 2017, emphasizing balance in expertise, including senior academics for outputs and environments, research users (non-academic experts) for impact case studies, and international advisers; equality and diversity considerations guide appointments without compromising scholarly rigor. Sub-panels may expand during the assessment phase by adding specialists, such as those for interdisciplinary work or language-specific outputs, to address gaps in coverage. Expertise within panels draws from academic leaders, with main panel chairs for REF 2021 exemplifying domain authority: Professor John Iredale for Main Panel A (medicine, health, and life sciences), Professor David Price for Main Panel B (physical sciences, engineering, and mathematics), Professor Jane Millar for Main Panel C (social sciences), and Professor Dinah Birch for Main Panel D (arts and humanities). This composition enables rigorous, field-specific scrutiny, with sub-panels assessing quality thresholds (e.g., 4* for world-leading, 3* for internationally excellent), often via double review of outputs, supplemented by bibliometric data where appropriate, while main panels harmonize judgments to mitigate subjectivity. Panels operate in phases—criteria-setting, submission review, and assessment—with documented working methods to maintain transparency and accountability.

Evaluation Criteria

Outputs and Their Quality Assessment

Research outputs in the REF, such as peer-reviewed journal articles, monographs, conference contributions, software, exhibitions, and performances, are evaluated through expert peer review by sub-panels organized into main panels covering broad disciplinary areas. These panels consist of academics and, in some cases, users of research, who assess outputs against three core criteria: originality, referring to the novelty and innovative contribution to knowledge, methods, or understanding; significance, encompassing the influence and importance of the work within its field or beyond; and rigour, evaluating the soundness of the methodology, argumentation, and intellectual coherence. In REF 2021, outputs carried a 60% weighting in the overall quality profile for each unit of assessment (UoA), with sub-panels reviewing a total of 185,594 items across 34 UoAs, often involving double review and cross-referral for interdisciplinary or specialist expertise. Quality is graded on a five-point scale: 4* for world-leading in terms of originality, significance, and rigour; 3* for internationally excellent; 2* for internationally recognized; 1* for nationally recognized; and unclassified for work below the standard of national recognition. Institutions submit an average of 2.5 outputs per full-time equivalent (FTE) researcher in the submitting pool, with a minimum of one and a maximum of five per individual, allowing flexibility for varied research practices while emphasizing quality over quantity; only outputs published during the assessment period and attributable to the institution are eligible. Assessment prioritizes qualitative judgment of the output's content, explicitly avoiding reliance on journal-based metrics or publisher prestige in line with declarations such as DORA, though panels may consider contextual evidence of impact within the assessment. For REF 2029, the assessment evolves into the "Contribution to Knowledge and Understanding" (CKU) element, weighted at 50%, which builds primarily on outputs but incorporates broader disciplinary contributions evidenced through institutional statements and quantitative data. This shift aims to capture cumulative advancement rather than isolated items, with sub-panels applying similar originality-significance-rigour criteria but emphasizing evidence of field-wide progress, potentially reducing gaming through output maximization. Outputs remain central, but the framework encourages recognition of diverse formats, including practice-based research, while maintaining peer-led evaluation to ensure rigour against verifiable standards.

Impact Measurement and Case Studies

Impact in the Research Excellence Framework (REF) constituted 25% of the overall assessment weight in REF 2021, up from 20% in REF 2014, with evaluation based exclusively on submitted impact case studies rather than broader institutional narratives. These case studies must evidence tangible effects or benefits beyond academia, such as changes to the economy, society, culture, public policy, services, health, or the environment, arising from research conducted at the submitting higher education institution (HEI). Eligible impacts are limited to those occurring between 1 August 2013 and 31 July 2020 for REF 2021 (with a pandemic-related extension to 31 December 2020), while the underpinning research must have been produced between 1 January 2000 and 31 December 2020. Institutions submit case studies using a standardized template outlined in Annex G of the REF guidance, which requires sections detailing the underpinning research (including outputs and context), references to that research, and a narrative of the impact achieved, supported by corroborating sources such as testimonials, metrics, or third-party documentation. The template emphasizes demonstrable causal links between the research and impacts, with evidence assessed for credibility and plausibility rather than exhaustive quantification. In REF 2021, HEIs collectively submitted over 6,800 impact case studies across 34 units of assessment, covering a wide range of sectors. Assessment occurs through peer review by REF sub-panels, comprising academic experts and research users (non-academics from beneficiary sectors), who evaluate case studies holistically against two primary criteria: reach, defined as the breadth of audiences, sectors, or geographical areas affected, and significance, encompassing the scale, magnitude, and beneficial changes realized. These criteria are not scored separately but integrated to assign an overall quality profile on a five-point scale: 4* (world-leading in terms of reach and significance), 3* (internationally excellent), 2* (recognised internationally), 1* (recognised nationally), or unclassified (below national quality). Panels verify claims through the provided evidence, cross-referencing with submitted outputs, and may consult external assessors for specialized validation, prioritizing empirical substantiation over anecdotal assertions. This methodology has been credited with incentivizing evidence-based demonstration of research benefits, as evidenced by REF 2021 analyses showing impacts spanning local to global scales, though critiques highlight potential inconsistencies in peer judgments due to subjective interpretations of causality and evidence thresholds. Sub-panel guidelines require calibrated scoring via mock exercises and inter-panel moderation to mitigate variability, ensuring assessments reflect verifiable contributions rather than institutional prestige alone. In practice, high-rated case studies often feature quantifiable metrics, such as policy adoptions influencing millions or economic returns exceeding research costs by factors of 10 or more, underscoring the framework's emphasis on causal realism in impact attribution.

Research Environment Evaluation

The research environment element of the Research Excellence Framework (REF) evaluates the vitality, sustainability, and support structures underpinning research activities within units of assessment (UoAs), including strategies for staff development, research student supervision, equality and diversity, and collaboration. In REF 2021, this element accounted for 15% of the overall assessment weighting, with the remaining portions allocated to outputs (60%) and impact (25%). Submissions comprised a narrative template limited to 10 pages per UoA, supplemented by quantitative indicators such as full-time equivalent (FTE) staff numbers, doctoral completions, and trends in research income from 2013/14 to 2019/20. Peer review panels applied a star-based grading scale (4* for world-leading quality down to 1* for nationally recognized, with unclassified below that), focusing on criteria such as the unit's capacity to sustain high-quality research, support for diverse researchers including early-career staff and those with protected characteristics, and the adequacy of infrastructure such as facilities and funding success rates. Panels calibrated scores through benchmarking against national norms and cross-panel consistency exercises, emphasizing demonstrable evidence over unsubstantiated claims; for instance, high scores required proof of equitable practices, such as career progression for under-represented groups, rather than aspirational statements. Quantitative metrics informed but did not determine scores, as human judgment predominated to assess contextual factors like discipline-specific challenges. For REF 2029, the environment weighting is set to rise to 25%, with a shift toward greater emphasis on institutional-level elements, including piloted research culture metrics in areas such as equity. This includes potential standalone assessment of people and culture (e.g., via surveys on workload, openness, and inclusivity), alongside continued UoA-specific evaluation, aiming to better capture systemic factors influencing research amid critiques that prior templates underemphasized cultural dynamics. Peer review panels will retain primacy but integrate more standardized indicators to enhance comparability, with guidance stressing evidence of causal links between environment features and research outcomes.

Grading Scales and Profiles

The Research Excellence Framework (REF) evaluates quality using a five-point scale applied separately to outputs, impacts, and the research environment within each unit of assessment (UOA). This scale comprises grades of 4* (world-leading), 3* (internationally excellent), 2* (internationally recognised), 1* (nationally recognised), and unclassified (below nationally recognised standards), with assessments calibrated against criteria of originality, significance, and rigour for outputs; reach and significance for impacts; and vitality, sustainability, research support, and collaboration for the environment. Panels determine grade boundaries through calibration exercises to ensure consistency, with 4* reserved for work demonstrating exceptional advancement in knowledge or practice, while lower grades reflect diminishing international or national comparability. For each submission, panels produce distinct profiles indicating the percentage of activity assigned to each grade level. Output profiles reflect the distribution across submitted items (e.g., publications or artefacts), weighted by staff time; impact profiles derive from case studies and additional evidence, capturing non-academic benefits; and environment profiles assess the UOA as a whole based on submitted templates and supporting data such as income and doctoral completion metrics. These element-specific profiles enable granular analysis of strengths, with unclassified portions typically minimal in high-performing UOAs but indicating areas failing basic thresholds. An overall quality profile is then calculated as a weighted composite: 60% from outputs, 25% from impacts, and 15% from environment, yielding a distribution of percentages across the scale that informs institutional rankings and funding (a worked sketch of this aggregation follows the grade table below). This aggregation, while promoting balanced evaluation, has drawn scrutiny for potentially diluting element-specific insights, as the weighted average smooths disparities (e.g., strong outputs offsetting weaker impacts). Grade point averages (GPAs), derived by assigning numerical values (4 for 4*, etc.) to profile percentages, provide a supplementary metric for comparisons, though profiles themselves prioritize nuanced distributions over scalar summaries.
Grade: Description
4*: World-leading in originality, significance, and rigour (outputs); outstanding reach and significance (impacts); world-leading support for sustained excellence (environment).
3*: Internationally excellent in originality, significance, and rigour (outputs); significant reach and importance (impacts); internationally excellent vitality and sustainability (environment).
2*: Recognised internationally in originality, significance, and rigour (outputs); recognised but modest additional reach and significance (impacts); recognised internationally as conducive to excellent research (environment).
1*: Recognised nationally in originality, significance, and rigour (outputs); limited additional reach and significance (impacts); recognised nationally as supportive of research (environment).
Unclassified: Does not meet recognised standards.
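The aggregation arithmetic can be illustrated with a minimal sketch. The sub-profile percentages and FTE volume below are invented for demonstration, not official REF data, and the code simply applies the 60/25/15 weightings and grade points described above:

```python
# Illustrative aggregation of element sub-profiles into an overall REF-style
# quality profile, GPA, and volume-scaled "research power" (invented data).
WEIGHTS = {"outputs": 0.60, "impact": 0.25, "environment": 0.15}
GRADES = ["4*", "3*", "2*", "1*", "unclassified"]
POINTS = {"4*": 4, "3*": 3, "2*": 2, "1*": 1, "unclassified": 0}

# Percentage of activity at each grade, per element (hypothetical unit).
sub_profiles = {
    "outputs":     {"4*": 45, "3*": 40, "2*": 12, "1*": 3, "unclassified": 0},
    "impact":      {"4*": 50, "3*": 38, "2*": 10, "1*": 2, "unclassified": 0},
    "environment": {"4*": 60, "3*": 30, "2*": 10, "1*": 0, "unclassified": 0},
}

# Overall profile: weighted average of the element profiles, grade by grade.
overall = {g: sum(WEIGHTS[e] * sub_profiles[e][g] for e in WEIGHTS) for g in GRADES}

# GPA: grade points weighted by the overall percentage at each grade.
gpa = sum(POINTS[g] * overall[g] for g in GRADES) / 100

# Research power, as used in sector rankings: GPA scaled by submitted volume.
fte_submitted = 120.0
research_power = gpa * fte_submitted

print({g: round(p, 1) for g, p in overall.items()})  # {'4*': 48.5, '3*': 38.0, ...}
print(round(gpa, 2), round(research_power, 1))       # 3.33 399.2
```

This also makes the dilution critique visible: a strong output sub-profile can pull the overall 4* share well above a weaker impact or environment sub-profile, since the composite is a straight weighted mean.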

Outcomes and Resource Allocation

REF 2021 Results and Institutional Performance

The REF 2021 results, published on 12 May 2022, evaluated submissions from 157 UK higher education institutions (HEIs) across 34 subject-based units of assessment, covering research activity from the 2014–2020 period. Overall, 41 per cent of submitted outputs received a 4* rating for world-leading quality, while 43 per cent were graded 3* for internationally excellent quality, reflecting a high baseline of performance across the sector. Impact case studies and environment templates similarly garnered strong evaluations, with the increased weighting of impact (to 25 per cent from 20 per cent in REF 2014) contributing to elevated overall scores, particularly in the social sciences and arts and humanities, where mean GPAs rose by approximately 0.24–0.25 points. Institutional performance was measured via overall quality profiles, yielding a grade point average (GPA) on a 1–4 scale (weighted 60 per cent outputs, 25 per cent impact, 15 per cent environment), alongside research power (GPA adjusted for submission volume). Imperial College London achieved the highest overall GPA of 3.63, surpassing the previous leader, the Institute of Cancer Research (ICR), which ranked second despite its narrower submission scope of just two units. Other selective, research-intensive institutions followed closely, underscoring their dominance in quality metrics. Specialized entities like the ICR and the London School of Hygiene and Tropical Medicine (LSHTM) excelled in niche areas, with LSHTM posting a GPA of 3.78 in one unit. Larger comprehensive universities, such as the University of Oxford and UCL, demonstrated superior research power due to higher submission volumes, retaining positions in the top five for aggregate output despite slightly lower per-unit GPAs compared to elite peers. Mid-tier and post-1992 institutions showed gains in select units, with overall sector improvements attributed to refined assessment criteria emphasizing real-world impact over pure volume. However, disparities persisted: "golden triangle" HEIs (Oxford, Cambridge, Imperial, UCL, LSE, and King's College London) captured a disproportionate share of top grades, comprising most of the top 10 by GPA, though their collective funding allocation faced marginal dilution from broader excellence distribution. These outcomes highlighted competitive stratification, with 93 per cent of research at some high performers rated 3* or above, versus more variable profiles at smaller or teaching-focused providers.

Mechanisms for Funding Distribution

The Research Excellence Framework (REF) serves as the primary mechanism for distributing approximately £2 billion in annual Quality-Related (QR) research funding across higher education institutions, administered by the four funding bodies: Research England for England, the Scottish Funding Council (SFC) for Scotland, the Higher Education Funding Council for Wales (HEFCW) for Wales, and the Department for the Economy (DfE) for Northern Ireland. This funding supports research infrastructure, staff costs, and knowledge exchange, with institutions retaining discretion over its use. Allocations are formulaic, deriving from REF's overall quality profiles—comprising output, impact, and environment sub-profiles—adjusted for research volume measured in full-time equivalent (FTE) staff. Funding emphasizes high-quality research rated 3* (internationally excellent) and 4* (world-leading), using multipliers such as 1.0 for 4*, 0.6 for 3*, and lower values for lesser grades, multiplied by volume to compute "REF power." In England, which receives the largest share (around £1.3 billion in mainstream QR annually), Research England applies REF 2021 results through a weighted formula: 70% for outputs, 15% for impact, and 15% for environment. This differs from REF's assessment weightings (60% outputs, 25% impact, 15% environment) to prioritize output quality in funding decisions. For 2025-2026, total QR allocations reached £1,987 million, incorporating mainstream QR (£1,303 million), research degree programme supervision (£344 million), charity support (£219 million), and business research (£114 million), with no changes to weightings or methods from prior years to ensure allocation stability. Post-REF results, such as those from 2021, inform allocations for several years, blending with historical performance to mitigate volatility. The devolved funding bodies adapt REF outcomes to national priorities, though specifics vary and are less transparently formulaic than in England. Scotland's SFC, for instance, integrates REF quality shares but incorporates philanthropic income and regional equity factors. Wales and Northern Ireland similarly use REF results to gauge excellence while adjusting for local policies, such as supporting smaller institutions or specific disciplines, resulting in allocations that may diverge from pure REF power metrics. Across all bodies, QR funding post-REF 2021 has included targeted supplements, like flexible allocations for research culture (2022-2024), but core mechanisms remain tied to empirical REF scores rather than competitive bidding. This approach aims to reward sustained excellence while enabling devolved flexibility, though it has prompted debates on inter-nation disparities.
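A simplified sketch of this formulaic allocation follows. It is not any funding body's actual method: the institutions "Alpha" and "Beta", the £100 million pot, and the zero weighting below 3* are assumptions for illustration, while the 1.0 and 0.6 multipliers are those quoted above:

```python
# Simplified pro-rata QR-style allocation (assumed mechanics; figures invented).
MULTIPLIERS = {"4*": 1.0, "3*": 0.6}  # grades below 3* attract no funding here

institutions = {
    # name: (FTE volume, {grade: % of activity})
    "Alpha": (300.0, {"4*": 40, "3*": 45, "2*": 15}),
    "Beta":  (80.0,  {"4*": 55, "3*": 35, "2*": 10}),
}

pot = 100_000_000  # hypothetical annual QR pot, in pounds

def funding_power(fte, profile):
    # Quality-weighted volume: FTE at each fundable grade times its multiplier.
    return sum(fte * profile.get(g, 0) / 100 * m for g, m in MULTIPLIERS.items())

powers = {name: funding_power(fte, prof) for name, (fte, prof) in institutions.items()}
total_power = sum(powers.values())
allocations = {name: pot * p / total_power for name, p in powers.items()}

for name, amount in allocations.items():
    print(f"{name}: £{amount:,.0f}")  # Alpha: £76,776,165  Beta: £23,223,835
```

The example shows why volume matters alongside quality: Beta has the stronger profile, but Alpha's much larger FTE base gives it the dominant share of the pot.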

Rankings, Incentives, and Competitive Dynamics

The REF produces rankings at the level of higher education institutions (HEIs) and their units of assessment (UoAs) via aggregated quality profiles, which quantify the proportion of research rated at each grade: world-leading (4*), internationally excellent (3*), recognised internationally (2*), and recognised nationally (1*). For REF 2021, results released on 12 May 2022 indicated that 41 per cent of outputs across all submissions achieved 4* status, while 43 per cent were rated 3*. These profiles feed into broader institutional rankings by research power—computed as the volume of submitted activity multiplied by the grade point average (GPA)—enabling comparisons that highlight top performers, such as the Institute of Cancer Research topping biological sciences in weighted quality. Rankings are not absolute but relative, with full datasets available for download, allowing analysis by UoA, institution, or overall power. Incentives arise chiefly from REF's integration with Quality-related Research (QR) funding, where outcomes determine the distribution of approximately £2 billion annually from UK funding councils to HEIs, with higher-rated research attracting a disproportionate share—often weighted heavily toward 4* activity. This structure motivates institutions to enhance research portfolios, evidenced by empirical studies showing accelerated publication outputs and submission volumes in the lead-up to REF deadlines, such as a detectable spike in research activity preceding the 2014 assessment. For researchers, incentives include career advancement tied to REF-eligible contributions, though panels emphasize quality over quantity, and 100 per cent of eligible staff are now required to be returned under REF 2021 rules to promote inclusivity in assessment. Competitive dynamics manifest in a funding contest among over 150 submitting HEIs, where gains for high performers come at the expense of others, driving strategies like selective recruitment of REF-optimizing academics—often timed near submission cycles to boost volume and GPA. This rivalry has elevated overall research standards, as monitoring via the REF correlates with sustained performance uplifts in monitored metrics such as citation impact, yet it also amplifies institutional hierarchies, with pre-1992 universities dominating top rankings due to scale advantages. Post-REF 2021, such dynamics influenced QR allocations, rewarding powerhouses while pressuring lower-ranked entities to consolidate or specialize, though critiques note potential for short-termism over long-term innovation.

Effects on UK Research Ecosystem

Evidence of Quality Improvements

Empirical analyses using difference-in-differences methods, comparing UK universities to matched US institutions as a control group, indicate that the 2014 REF cycle prompted a 41.37% increase in research outputs per department between 2009 and 2014, with a concurrent rise in the proportion of publications in high-quality journals (rated 3, 4, or 4* under the Academic Journal Guide of the Chartered Association of Business Schools). This shift toward more prestigious venues suggests an enhancement in the perceived or assessed quality of submitted outputs, as institutions strategically prioritized work aligned with REF's emphasis on excellence. The effect was more pronounced in Russell Group universities, indicating that competitive pressures amplified quality-oriented behaviors in top-tier institutions. Bibliometric indicators further corroborate sustained quality gains post-REF implementation. UK research exhibits field-weighted impacts 26% above the global average for internationally co-authored papers, reflecting robust collaboration and reception in peer communities. REF's periodic assessments have been linked to these trends by incentivizing investment in research environments, with over 80% of evaluated outputs in the REF classified as world-leading or internationally excellent, up from prior RAE cycles when adjusted for methodological changes. However, per-author productivity remained stable, implying that volume-driven expansions via hiring contributed to aggregate improvements without proportional individual gains. Longer-term incentives from the REF appear to foster diversification in topics post-evaluation periods, potentially enhancing overall novelty and breadth by reducing deadline-induced conservatism. Analyses of 3.6 million publications reveal that outputs immediately preceding REF deadlines suffer from lower citations and publication in lower-impact journals, but this reverses afterward, supporting the framework's role in elevating baseline standards through accountability and funding ties. These patterns align with causal evidence from performance-based funding systems, where evaluation cycles correlate with heightened focus on rigorous, impactful work.
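The difference-in-differences logic behind these estimates can be written compactly. The sketch below uses invented pre/post figures purely to illustrate the computation, not the cited study's actual data:

```python
# Difference-in-differences sketch with invented figures:
# effect = (UK change over time) minus (matched US control change over time).

uk_pre, uk_post = 100.0, 141.4  # mean outputs per UK department, before/after
us_pre, us_post = 100.0, 108.0  # mean outputs per matched US department

did_estimate = (uk_post - uk_pre) - (us_post - us_pre)  # 33.4
pct_of_baseline = did_estimate / uk_pre * 100           # 33.4%

print(f"DiD estimate: {did_estimate:.1f} outputs (~{pct_of_baseline:.1f}% of baseline)")
```

Subtracting the control group's trend is what lets such studies attribute the residual output growth to the REF cycle rather than to sector-wide trends affecting both countries.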

Societal and Economic Contributions

The Research Excellence Framework (REF) has driven societal contributions by systematically evaluating and incentivizing research impacts beyond academia, as demonstrated through impact case studies (ICSs) submitted by UK higher education institutions. In REF 2021, institutions provided 6,781 ICSs, of which 6,361 were publicly analyzed, documenting benefits at local, national, and international scales. These cases, peer-assessed for quality, highlight multidisciplinary research—72% drawing from at least two fields—translating into tangible outcomes such as improved public services and community cohesion. For instance, the University of Oxford's development of the Oxford–AstraZeneca COVID-19 vaccine, supported by REF-evaluated research, resulted in over 2.5 billion doses supplied globally, contributing to pandemic mitigation efforts. Similarly, the RECOVERY trial, informed by UK research excellence, saved approximately 650,000 lives worldwide by optimizing COVID-19 treatments. Economic contributions stem from REF's role in allocating around £2 billion annually in public funding to high-performing institutions, prioritizing research with verifiable non-academic returns. In REF 2021 analyses, 34% of ICSs referenced financial metrics or return on investment (ROI), with 58 cases quantifying ROI in detail, often exceeding national averages. Business and enterprise engagement featured in 298 ICSs, fostering SME growth, innovation, and job retention. The University of Manchester's Urban Living Labs, for example, secured £26 million in infrastructure investment and reduced van trips by 20,000 km annually in the region. Another case, the Centre for Global Eco-Innovation, achieved a 5.5:1 ROI ratio, surpassing the average of 2.8:1 through collaboration and commercialization. Earlier REF cycles provide longitudinal evidence of sustained effects. REF 2014's 6,975 ICSs evidenced societal benefits in 75% of cases (e.g., health advancements aiding millions) and economic impacts in 25%, including £4.7 billion in economic activity generated from research. These outcomes, reaching thousands of businesses and global beneficiaries, underscore REF's mechanism in channeling resources toward high-impact work, though self-reported elements in ICSs warrant panel validation to mitigate potential overstatement. Overall, the REF has amplified UK research's role in sectors like health (e.g., 192 ICSs on cancer diagnostics) and the environment (80 on net zero), with 46% of analyzed ICSs linked to UKRI funding.
Key REF impact examples | Sector | Societal benefit | Economic/quantified benefit
Oxford–AstraZeneca COVID-19 vaccine (REF 2021) | Health | Global pandemic response | 2.5 billion doses supplied
RECOVERY Trial (REF 2021) | Health | Treatment optimization | ~650,000 lives saved
Urban Living Labs (REF 2021) | Urban infrastructure | Improved mobility | £26m investment; 20,000 km fewer van trips
Centre for Global Eco-Innovation (REF 2021) | Eco-innovation | Innovation transfer | 5.5:1 ROI
Research-led industry cases (REF 2014) | Industry | Industrial advancements | £4.7bn activity generated

Shifts in Academic Practices and Careers

The introduction of output portability in REF 2014, whereby research outputs could be attributed to the submitting institution regardless of the researcher's prior affiliation, prompted institutions to engage in strategic recruitment of high-performing academics to enhance their REF submissions. This practice intensified competition for established researchers with strong publication records, leading institutions to offer incentives such as higher salaries and reduced teaching loads to secure "star" hires capable of delivering 4*-rated outputs. Empirical analysis of REF 2014 data indicates that such gaming behaviors contributed to a concentration of top-rated outputs in select institutions, with portability enabling a reallocation of credited excellence across the sector. REF assessments have driven a measurable increase in research productivity, as evidenced by a study comparing UK universities' output against counterfactual scenarios, which found that REF 2014 significantly boosted publication volumes while maintaining or improving citation impacts. The weighting of impact—rising to 25% in REF 2021—shifted practices toward documenting societal and economic contributions through case studies, compelling researchers to allocate time beyond traditional outputs to evidencing non-academic benefits, such as policy influence or industry applications. This evolution, while fostering broader accountability, has been linked to short-term behavioral adjustments, including deadline-driven surges in submissions, as institutions optimize portfolios around REF cycles. Career trajectories in academia have become more closely aligned with REF performance metrics, with promotions and tenure decisions increasingly predicated on an individual's contribution to institutional submissions, including outputs rated at world-leading (4*) levels. Early-career researchers (ECRs) face amplified "publish or perish" pressures, as REF eligibility criteria—such as demonstrating significant responsibility for research—exclude many from submissions unless they produce high-caliber work early, exacerbating precarious contracts and delaying permanent positions. Surveys of clinical academics participating in REF 2021 revealed heightened administrative burdens and selection biases, with non-submitted staff reporting diminished career progression opportunities due to perceived lower REF value. Overall, approximately 70% of surveyed researchers viewed the REF as exerting a net negative influence on academic culture, citing intensified workloads and metric fixation over diverse scholarly activities.

Criticisms and Counterarguments

Claims of Methodological Inefficiencies

Critics of the Research Excellence Framework (REF) have highlighted its methodological inefficiencies, particularly the disproportionate administrative and financial burdens relative to the scale of funding distributed, with costs estimated at around 0.3% of total research funding in earlier iterations. The REF 2014 process incurred total costs of £246 million, with £212 million borne by higher education institutions for preparation alone, equating to approximately £4,000 per submitted researcher. These costs represented a 43% increase over the 2008 Research Assessment Exercise (RAE) when excluding the newly introduced impact element, which added £55 million in preparation expenses due to the need for detailed case studies and evidence gathering. Such resource demands arise from extensive internal processes, including mock REF exercises, staff selection criteria, and compliance with equality and personal circumstances guidelines, which consumed significant central management time—up to 21% of submission efforts. The core peer review methodology has been faulted for subjectivity and inconsistency, exacerbating inefficiencies by requiring panel members to dedicate an average of 533 hours (equivalent to 71 working days) per person, monetized at professorial salary levels. This labor-intensive approach, reliant on expert judgment without systematic use of metrics like journal impact factors (explicitly prohibited), promotes conservatism and homogeneity, as panels favor established, mainstream outputs over interdisciplinary or transformative work, which faces unclear criteria and potential underrepresentation in submissions. Biases toward larger departments, research-intensive institutions, and basic over applied research further undermine efficiency, with high correlations between assessment elements (e.g., 0.73 between environment and output scores) suggesting redundancy in evaluating outputs, impact, and environment separately. Impact assessment, introduced in REF 2014, has drawn particular scrutiny for methodological flaws, including challenges in distinguishing quality levels (e.g., 2-star versus 3-star cases) and over-reliance on narrative statements that amplify language-based biases linked to reviewer affiliations. Preparation for these elements diverted substantial institutional resources, with outputs accounting for 45% of time spent, often leading to behaviors like "salami slicing" of publications to maximize countable items rather than prioritizing depth. Critics argue this complexity displaces core research activities, with surveys indicating 67% of academics reporting excessive time on REF outputs and 34% experiencing negative health impacts from workload pressures. Overall, the absence of integrated data systems, such as comprehensive Current Research Information Systems, forces ad hoc data collection, amplifying inefficiencies across cycles.

Burdens on Researchers and Potential Gaming

The Research Excellence Framework imposes substantial administrative burdens on researchers, including extensive documentation of outputs, impact case studies, and environment assessments, which diverts time and resources from core research activities. The REF 2014 process cost UK institutions £212 million out of a total £246 million expenditure, representing a 133% increase from the £66 million cost of the preceding Research Assessment Exercise in 2008, with much of the burden falling on staff selection and submission preparation. This workload contributed to heightened stress, as evidenced by 29.2% of submitted researchers providing fewer than the required four outputs due to special circumstances, reflecting pressures that discouraged risky or interdisciplinary work. A 2022 survey of 73 clinical academics found that 80% reported longer working hours attributable to REF 2021 preparations, with 44% experiencing stress or mental ill-health and 26% distracted from clinical duties; participants described the process as wasting "huge amounts of time" on low-value tasks like evidence compilation, exacerbating opportunity costs for patient-facing research. Reforms introduced after the 2016 Stern Review, such as averaging output requirements across the unit rather than fixing four per individual, and more institution-level submission elements, aimed to alleviate individual burdens but have not fully mitigated them, as ongoing preparations for REF 2029 continue to demand significant internal administrative effort. Potential gaming behaviors further undermine the framework's integrity: before REF 2021, institutions selectively submitted staff to inflate average quality profiles, creating an incomplete picture that distorted institutional comparisons—for instance, Cardiff University reduced eligible staff submissions from 1,030 in 2008 to 738 in 2014, boosting its grade point average ranking to sixth nationally at the expense of research volume. Such strategies, including fractional contract hiring and pre-census staff reallocation, drove salary inflation and career disruptions, prompting the Stern Review to recommend non-portable outputs and mandatory inclusion of all research-active staff to curb incentives for exclusion. Despite REF 2021 implementing these changes—requiring all eligible staff to be returned, each with at least one output—critics note persistent internal gaming, such as mock assessments pressuring underperformers into teaching-only roles, with REF 2029 guidance explicitly acknowledging risks of unfair advantages through resource concentration.

Empirical Defenses and Demonstrated Benefits

Empirical analyses of REF evaluation cycles indicate that deadlines create short-term incentives for researchers, followed by elevated publication quality post-assessment. A study examining over 3.5 million publications found that outputs immediately preceding deadlines exhibited lower citation rates and appeared in lower-impact journals, but this trend reversed sharply afterward, with post-deadline work garnering higher citations, greater topic diversity, and publication in journals with longer citation half-lives, suggesting sustained improvements in ambition and quality. The REF 2021 outcomes provide quantitative evidence of elevated research standards across the sector, with 41% of submitted outputs rated world-leading (4*) and 43% internationally excellent (3*), reflecting broad excellence in peer-reviewed assessments. In specific disciplines, such as business and management studies, the proportion of outputs at 3* or 4* reached 79%, marking an improvement over REF 2014 profiles, with gains attributed to competitive pressures. By mandating impact case studies, the REF has documented tangible non-academic benefits, with the 2014 exercise yielding 6,975 cases illustrating contributions to the economy, public policy, health, and the environment through multidisciplinary collaborations. Analyses of REF 2021 cases, including 746 from Scottish universities, highlight quantifiable societal gains in areas like drug discovery, clinical trials, and sustainable aquaculture, aligning with national frameworks for innovation and the UN Sustainable Development Goals, thereby evidencing REF's role in fostering accountable, real-world research translation. These mechanisms counter claims of inefficiency by linking funding to verifiable outcomes, enhancing resource allocation efficiency despite administrative costs.

Global Context and Comparisons

Export of REF Model to Europe and Beyond

The UK's Research Excellence Framework (REF) has exerted significant influence on the development of performance-based research funding systems (PRFS) across Europe, where over 15 countries have implemented such mechanisms since the early 2000s, often drawing initial inspiration from the REF's precursor, the Research Assessment Exercise (RAE). However, no European nation has fully replicated REF's emphasis on intensive peer review of outputs, environments, and impacts, primarily because of its high administrative costs, estimated at roughly £2-3 per £100 of funding allocated in the UK, and its scalability challenges in larger or more decentralized systems. Instead, most continental European PRFS prioritize bibliometric indicators, such as citation counts from databases like Scopus or Web of Science, combined with lighter peer review, as in Denmark (since 2010) and the Netherlands (via the Standard Evaluation Protocol, updated 2015). This adaptation reflects a causal trade-off: bibliometrics enable cheaper, faster assessments but risk undervaluing non-quantifiable contributions such as interdisciplinary or humanities research, a critique echoed in analyses of REF's own limitations.

Poland represents the closest European approximation to REF's model, incorporating UK-style impact evaluation since its 2012-2017 assessment cycle, in which societal and economic impacts constituted 20% of scoring, alongside 70% for outputs and 10% for research potential. Poland's parametric evaluations, conducted every four years since 1991 by the Ministry of Science and Higher Education, allocate funding based on categorized excellence levels, mirroring REF's star ratings, and have involved international panels influenced by UK practice. The system underwent reforms in 2018-2022 to refine its impact metrics and address translation issues in non-English contexts, but it retains REF-like elements that incentivize quality over volume. Outcomes include funding concentration in top institutions, though critics note persistent gaming via selective submissions, similar to UK experience.

Beyond Europe, Hong Kong's Research Assessment Exercise (RAE), initiated in 1993 under the University Grants Committee, directly emulates the UK's RAE/REF framework, with peer-reviewed outputs, environments, and impacts determining research postgraduate places and funding for its eight publicly funded universities. The 2020 RAE aligned closely with REF 2014 (60% outputs, 15% impact, 25% environment), while the upcoming 2026 iteration incorporates REF 2021 reforms, including weighting for people, culture, and open research, and employs UK experts as assessors. This adoption stems from historical colonial ties that lasted until 1997 and has yielded high excellence ratings (70% of submissions were deemed internationally excellent or above in 2014), though it faces scrutiny for burdening smaller disciplines. In Australia, the Excellence in Research for Australia (ERA), launched in 2010 by the Australian Research Council, shares REF's quality focus but relies more on metrics (for example, journal rankings and citation data organized by Field of Research codes) than on narrative peer review; the influence also ran in reverse, with Australia's earlier impact-assessment plans shaping the UK's adoption of impact in REF 2014. ERA rounds ran in 2010, 2012, 2015, and 2018, with 76% of research rated at or above world standard in 2018, though plans to integrate fuller impact assessments were paused after the 2013 change of government.
These exports highlight REF's global demonstration effect in linking assessment to funding incentives, yet the adaptations underscore contextual variation: smaller systems such as Hong Kong's favor REF's granularity for prestige-building, while larger ones dilute it for efficiency. The Coalition for Advancing Research Assessment (CoARA), launched in 2022 with European Commission support and signed by over 500 organizations including UK funders, critiques REF-inspired competitiveness for fostering inequality and metrics fixation, prompting qualitative shifts in REF 2029 guidance. Empirical defenses from adopters, such as Hong Kong's sustained funding growth, counter claims of inefficiency, but no full-scale emulation has spread widely, as the documented burdens outweigh the benefits in many research ecosystems.

Contrasts with Alternative National Systems

The UK's Research Excellence Framework (REF) stands out for its reliance on expert peer review to evaluate outputs, societal impact, and institutional environment, conducted every six to seven years, in contrast to the bibliometric-heavy approaches prevalent in many other national systems. Several European systems, for instance, employ quantitative indicators such as publication volumes, journal prestige, and citation counts to inform annual performance-based research funding (PBRF) allocations, often weighting these metrics at 10-30% of institutional budgets and prioritizing efficiency and transparency over nuanced qualitative judgment. This metric-driven model, exemplified by the indicator-based system Norway adopted in the early 2000s, aims to minimize administrative costs (REF 2021 cost approximately £471 million) but risks overlooking disciplinary differences and non-quantifiable contributions such as interdisciplinary or humanities research.

Australia's Excellence in Research for Australia (ERA), administered by the Australian Research Council since 2010, integrates bibliometrics with limited peer review in its periodic evaluations, focusing primarily on output quality ratings derived from journal classifications and citation data rather than REF's comprehensive review of case studies and research environments. While ERA informs funding decisions indirectly through block grants, it lacks REF's direct allocation of around 20% of quality-related research funding based on graded excellence profiles, resulting in less emphasis on demonstrable societal benefits. Italy similarly blends peer panels informed by metrics in its national evaluation exercise (VQR), but applies formulaic adjustments annually, diverging from the REF's resistance to heavy bibliometric reliance in order to preserve expert judgment on originality and rigor.

In the United States, no centralized national framework equivalent to REF exists; research quality is gauged through decentralized mechanisms such as competitive grant review by the National Science Foundation (NSF) and the National Institutes of Health (NIH), which awarded $8.3 billion and $32.3 billion respectively in 2023 for project-specific funding, alongside internal university tenure processes and informal metrics in rankings. This grant-centric model fosters competition through ex-ante proposal review but omits the REF's ex-post institutional benchmarking, potentially leading to fragmented assessments without the UK's focus on broader ecosystem impacts. Germany's approach, coordinated through the German Research Foundation (DFG) and the Excellence Strategy launched in 2019 with €533 million annually for research clusters, emphasizes targeted competitive evaluations over the REF's universal departmental scrutiny, with state-level funding retaining historical components and minimal PBRF weight (under 5%). France's High Council for Evaluation of Research and Higher Education (HCERES) conducts institutional audits every five years, akin to the REF in methodology but narrower in scope, excluding direct impact grading and tying less funding to outcomes. These variations highlight the REF's distinctive balance of accountability and selectivity, though critics note its higher burden compared with lighter metric-based systems.
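
How graded profiles translate into money can be illustrated with Research England's mainstream quality-related (QR) allocation, which funds only 4* and 3* activity and weights 4* work four times as heavily as 3*. The sketch below assumes that published quality weighting plus scaling by submitted staff volume and a subject cost weight; the units, profiles, cost weights, and funding pot are invented, and real allocations apply further factors (such as London weighting) omitted here.

```python
# Hypothetical units: (name, submitted FTE, % at 4*, % at 3*, subject cost weight).
UNITS = [
    ("Chemistry", 40.0, 35.0, 45.0, 1.6),  # assumed high-cost laboratory subject
    ("Economics", 25.0, 45.0, 40.0, 1.0),
    ("History",   18.0, 30.0, 50.0, 1.0),
]
POT = 10_000_000  # invented annual funding pot, in pounds

def weighted_volume(fte, pct4, pct3, cost_weight):
    """Quality-and-volume share: 4* counts 4x, 3* counts 1x, below 3* nothing."""
    quality = 4.0 * (pct4 / 100.0) + 1.0 * (pct3 / 100.0)
    return fte * quality * cost_weight

shares = {name: weighted_volume(fte, p4, p3, cw) for name, fte, p4, p3, cw in UNITS}
total = sum(shares.values())
for name, share in shares.items():
    print(f"{name:10s} £{POT * share / total:>12,.0f}")
# Chemistry's cost weight and volume dominate despite Economics' higher quality profile.
```

Because 2* and 1* work attracts no QR funding at all, small shifts of activity across the 3* boundary move money disproportionately, one reason institutions invest so heavily in pre-submission grading exercises.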
