
NAPLAN

The National Assessment Program – Literacy and Numeracy (NAPLAN) is a standardised testing regime administered annually to all Australian students in Years 3, 5, 7, and 9, assessing core skills in reading, writing, language conventions (spelling, grammar, and punctuation), and numeracy to gauge progress toward national educational benchmarks. Introduced in 2008 as part of broader reforms to monitor and improve foundational competencies essential for academic advancement, NAPLAN generates comparable data across jurisdictions, enabling identification of underperforming schools, allocation of resources, and evaluation of teaching effectiveness through public reporting on the My School portal. Results, scaled to a common metric since inception, reveal modest gains in primary-level reading and numeracy but stagnation or declines in secondary writing, with jurisdictional variations highlighting uneven implementation impacts despite sustained policy focus. While lauded for providing empirical, system-wide insights absent in prior fragmented assessments, NAPLAN has drawn criticism for correlating with heightened student anxiety, narrowing of the curriculum toward testable content, and limited causal influence on long-term skill elevation, as evidenced by flat national trajectories amid middling international rankings in comparable surveys such as PISA and TIMSS. A 2020 independent review affirmed the value of standardised testing for accountability and trend monitoring but recommended reforms including full online delivery, revised writing prompts to curb formulaic responses, and integration with teacher judgments to mitigate over-reliance on aggregate scores.

Origins and Establishment

Inception and Legislative Basis

The National Assessment Program – Literacy and Numeracy (NAPLAN) originated from federal initiatives to standardize literacy and numeracy evaluations amid evidence of educational shortcomings, including Australia's declining performance in Programme for International Student Assessment (PISA) results from 2000 to 2006, where average scores in reading literacy fell from 528 to 513 and in mathematical literacy from 533 to 527. These trends, coupled with variability in state-level testing regimes, underscored the need for nationally consistent data to guide policy and resource decisions. The program built on prior state-based systems, such as Western Australia's Monitoring Standards in Education Program (introduced in 1998 for basic skills monitoring) and New South Wales' statewide literacy and numeracy testing commenced in 1990, which had highlighted inconsistencies in cross-jurisdictional comparisons. NAPLAN was formally implemented under the Rudd Labor government in 2008, with tests developed in 2007 and first administered nationwide on 15 May 2008 to all students in Years 3, 5, 7, and 9, encompassing over one million participants across government and non-government schools. This marked a shift from sample-based national assessments (initiated in 2003 for domains like science literacy) to full-cohort testing, endorsed by the Ministerial Council on Education, Employment, Training and Youth Affairs to fulfill commitments under the Adelaide Declaration on National Goals for Schooling (1999) for comparable achievement data. The 2008 Melbourne Declaration on Educational Goals for Young Australians further reinforced this by prioritizing transparent, evidence-based monitoring of foundational skills to address identified deficits. Its legislative basis derives from intergovernmental agreements rather than standalone federal statute, with participation mandated for government schools via state-territory agreements tied to funding under national partnerships, and voluntary yet near-universal for non-government schools through aligned policies. The Australian Curriculum, Assessment and Reporting Authority Act 2008 (Cth) established the Australian Curriculum, Assessment and Reporting Authority (ACARA) as the administering body, formalizing oversight of NAPLAN from 2010 onward while building on the program's initial intergovernmental framework. This structure enabled NAPLAN data to support broader transparency reforms, including linkages to school performance reporting mechanisms such as the My School website.

Key Stakeholders and Initial Development

The development of NAPLAN was led by the Ministerial Council on Education, Employment, Training and Youth Affairs (MCEETYA), comprising federal, state, and territory education ministers, who sought to establish a unified national framework for assessing literacy and numeracy skills previously evaluated through inconsistent state and territory tests. This collaborative effort involved input from jurisdictional testing authorities to ensure the assessments reflected agreed-upon essential elements across reading, writing, spelling, grammar and punctuation, and numeracy. The Australian Council for Educational Research (ACER) played a central role in the technical aspects, including item development, validation, and coordination of administration and scoring protocols among multiple organizations during the 2007 preparation phase. Tests underwent an iterative refinement process to prioritize measurable indicators of foundational proficiency, with the inaugural nationwide administration occurring in May 2008 for students in Years 3, 5, 7, and 9. This initial rollout marked the transition to standardized national benchmarking, supplanting prior fragmented assessments.

Test Components and Administration

Assessed Domains and Year Levels

NAPLAN assessments are administered to students in Years 3, 5, 7, and 9 across four primary domains: reading, writing, language conventions, and numeracy. These domains emphasize foundational literacy and numeracy skills as defined in the Australian Curriculum, serving as indicators of students' proficiency in core cognitive abilities without extending to specialized or advanced subject areas.

The reading domain evaluates students' capacity to derive meaning from Standard Australian English texts through several processes, including locating explicit information, making inferences, interpreting and integrating ideas, and critically examining or evaluating content. Texts encompass imaginative (20-60% weighting), informative (20-50%), and persuasive (15-45%) types, drawing on curriculum strand elements weighted at approximately 55-80%, 15-25%, and 5-20%; assessments use multiple-choice and technology-enhanced formats such as drag-and-drop to gauge text comprehension and contextual language use.

In the writing domain, students respond to a prompt to produce either narrative or persuasive texts, assessed via 10 criteria covering audience engagement, text structure, ideas, persuasive devices or character and setting, vocabulary, cohesion, paragraphing, sentence structure, punctuation, and spelling accuracy. Each criterion receives a score from 0 to 6 (or equivalent bands scaled by year level), prioritizing verifiable elements of composition such as logical progression and precise expression over subjective creativity.

The language conventions domain tests command of spelling, grammar, and punctuation as per the Australian Curriculum: English, divided into separate sections for spelling (featuring audio dictation at 55-65%, with other item formats at 15-25% each) and grammar and punctuation (grammar at 65-75%, punctuation at 25-35%). Items are primarily multiple-choice or technology-enhanced, focusing on accuracy in orthographic patterns, syntactic rules, and punctuation application without broader interpretive demands.

The numeracy domain measures application of mathematical knowledge, skills, procedures, and processes aligned with the Australian Curriculum: Mathematics, spanning content strands of number and algebra (50-60%), measurement and geometry (25-35%), and statistics and probability (10-20%). Proficiency emphases include understanding concepts (25-35%), fluency in procedures (15-25%), problem-solving (25-35%), and reasoning (15-25%), with tests incorporating non-calculator sections (all years) and calculator-allowed portions (Years 7 and 9) to assess operational basics such as number operations and geometric reasoning.
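For reference, the weighting ranges quoted above can be collected into a simple data structure; the layout below is an illustrative convenience only and does not reproduce an ACARA specification.

```python
# Illustrative only: approximate weighting ranges (percentages) quoted in the text above.
# The dictionary layout is a hypothetical convenience, not an official ACARA schema.
NAPLAN_WEIGHTINGS = {
    "reading_text_types": {
        "imaginative": (20, 60),
        "informative": (20, 50),
        "persuasive": (15, 45),
    },
    "numeracy_content_strands": {
        "number_and_algebra": (50, 60),
        "measurement_and_geometry": (25, 35),
        "statistics_and_probability": (10, 20),
    },
    "numeracy_proficiencies": {
        "understanding": (25, 35),
        "fluency": (15, 25),
        "problem_solving": (25, 35),
        "reasoning": (15, 25),
    },
}


def midpoint(lo: int, hi: int) -> float:
    """Return the midpoint of a weighting range, e.g. for rough planning."""
    return (lo + hi) / 2


if __name__ == "__main__":
    for strand, (lo, hi) in NAPLAN_WEIGHTINGS["numeracy_content_strands"].items():
        print(f"{strand}: {lo}-{hi}% (midpoint {midpoint(lo, hi):.0f}%)")
```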

Evolution of Test Formats

NAPLAN assessments commenced in 2008 as paper-based tests administered annually to students in Years 3, 5, 7, and 9 across Australia, covering literacy and numeracy domains through fixed-format items delivered via printed booklets and scanned responses. This format enabled national comparability but relied on manual processing, which introduced delays in scoring and limited scalability for larger cohorts or customized questioning. Efforts to transition to digital delivery began with planning in 2014, targeting full online implementation by 2016, though timelines extended due to infrastructure challenges; a limited online rollout occurred in 2018, involving approximately 5% of students who completed tests via computer. The shift progressed gradually, with the transition to online testing deemed complete by 2022, incorporating secure platforms for automated scoring and enhanced data security. By 2025, NAPLAN achieved full online administration except for Year 3 writing, which remained paper-based to address disparities in device access and typing proficiency among younger primary students, thereby mitigating potential inequities in test performance linked to technological familiarity. The online format introduced multistage adaptive testing, tailoring question difficulty based on real-time student responses, commencing in earnest with the 2023 assessments to minimize floor and ceiling effects observed in fixed tests—where low- or high-ability students encountered items misaligned with their proficiency, reducing measurement precision. This adaptation employed algorithms to select subsequent items from calibrated banks, yielding more efficient assessments with fewer questions while improving accuracy for extreme performers, though initial implementations required validation to ensure comparability with prior paper data. Concurrently, testing windows shifted earlier from the traditional May period to March starting in 2023, extending through 2024 and 2025 (e.g., March 12–24 in 2025), to streamline logistics and support prompt reporting of results. This adjustment facilitated quicker preliminary results—within four weeks post-testing by 2024—enhancing operational efficiency, albeit with trade-offs such as compressed school preparation timelines that could amplify logistical strains in under-resourced settings.
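The multistage branching described above can be illustrated with a minimal sketch; the testlets, thresholds, and routing rule below are hypothetical and do not reproduce ACARA's actual item banks or branching design.

```python
# Minimal sketch of multistage adaptive routing.
# Testlets, thresholds and scoring are hypothetical illustrations only;
# they do not reproduce ACARA's actual item banks or branching rules.
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class Testlet:
    name: str
    difficulty: str   # "easy", "medium" or "hard"
    items: list[str]  # placeholder item identifiers


STAGE_1 = Testlet("A", "medium", ["q1", "q2", "q3", "q4", "q5", "q6"])
STAGE_2 = {
    "easy":   Testlet("B-easy",   "easy",   ["e1", "e2", "e3", "e4", "e5", "e6"]),
    "medium": Testlet("B-medium", "medium", ["m1", "m2", "m3", "m4", "m5", "m6"]),
    "hard":   Testlet("B-hard",   "hard",   ["h1", "h2", "h3", "h4", "h5", "h6"]),
}


def route(stage1_correct: int, total: int = 6) -> Testlet:
    """Choose the second-stage testlet from first-stage performance.

    Thresholds (below one-third correct -> easy, above two-thirds -> hard)
    are illustrative only.
    """
    proportion = stage1_correct / total
    if proportion < 1 / 3:
        return STAGE_2["easy"]
    if proportion > 2 / 3:
        return STAGE_2["hard"]
    return STAGE_2["medium"]


if __name__ == "__main__":
    # A student answering 5 of 6 first-stage items correctly is routed to the
    # harder second-stage testlet, reducing ceiling effects for high performers.
    print(route(5).name)  # -> B-hard
```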

Scoring, Reporting, and Proficiency Standards

NAPLAN assessments utilize the Rasch measurement model for item calibration and scaling, applied through specialized psychometric software to estimate student abilities on domain-specific scales via marginal maximum likelihood methods. This approach ensures psychometric objectivity by modeling student proficiency as a latent trait, with item difficulties calibrated on a common scale and equated across years using common items to maintain scale stability. Scale scores, ranging from approximately 0 to 1000, represent absolute levels of achievement, with reliability coefficients typically exceeding 0.90 for most domains. In 2023, the NAPLAN scales were reset to new baselines, discontinuing prior time-series continuity to better accommodate online adaptive testing and align with revised proficiency standards. Proficiency is classified into four levels—Exceeding, Strong, Developing, and Needs Additional Support—defined by fixed cut-points tied to the Australian Curriculum's year-level expectations at the testing period.
  • Exceeding: Achievement well above curriculum expectations, demonstrating advanced application of skills.
  • Strong: Achievement meeting curriculum expectations, with solid foundational proficiency.
  • Developing: Achievement working towards expectations, indicating partial skill mastery.
  • Needs Additional Support: Achievement below expectations, requiring targeted intervention for basic competencies.
These absolute thresholds prioritize criterion-referenced tracking of skill development over percentile rankings, enabling evaluation of systemic progress in literacy and numeracy mastery independent of annual cohort fluctuations. Results are disseminated through Individual Student Reports (ISRs) provided to schools for parental distribution, detailing scale scores and proficiency levels per domain. School-level aggregates, including average scores and proficiency distributions, are published on the My School portal, while national summaries are released annually—typically in July or August, some months after the testing window—to support accountability without identifying individual students. For instance, 2023 national results were published on August 23, 2023. Plausible values account for measurement error in aggregates, ensuring robust inferences.
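For illustration, the dichotomous Rasch model described above can be written in its standard form; the notation is generic to Rasch measurement and is not drawn from ACARA's technical documentation.

```latex
% Standard dichotomous Rasch model: probability that student p answers item i
% correctly, given ability \theta_p and item difficulty b_i (notation illustrative).
P(X_{pi} = 1 \mid \theta_p, b_i) = \frac{\exp(\theta_p - b_i)}{1 + \exp(\theta_p - b_i)}
```

Under this model, item difficulties are calibrated from response data, common items administered across cycles anchor the scale between years, and estimated abilities are transformed onto the reported scale of approximately 0 to 1000, with plausible values supporting the aggregate reporting noted above.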

Policy Rationale and Objectives

Aims for National Standardization

NAPLAN was introduced in 2008 to establish a uniform national assessment framework, replacing disparate state-level testing regimes that had previously led to inconsistencies in evaluating literacy and numeracy proficiency across Australian jurisdictions. This shift enabled the first full-cohort testing of students in Years 3, 5, 7, and 9 in government and non-government schools, creating comparable data sets to benchmark performance against shared standards rather than fragmented regional metrics. The program's design emphasized empirical measurement of core skills in reading, writing, language conventions, and numeracy, aiming to highlight disparities in educational outcomes attributable to varying instructional practices or resourcing between states and territories. A primary aim was to furnish diagnostic tools for identifying systemic gaps in foundational competencies, facilitating targeted, data-driven responses over reliance on subjective or localized evaluations. By generating nationwide results that track student progress against proficiency levels—refined in 2023 to include explicit minimum benchmarks—NAPLAN supports the detection of underperformance in specific domains, such as reading comprehension or numerical reasoning, where anecdotal reforms had previously proven insufficient to close divides. This approach underscores the causal link between standardized diagnostics and effective policy adjustments, as uniform data reveals patterns not evident in isolated state-based assessments. The initiative aligned with intergovernmental agreements among federal and state ministers to enforce minimum competencies, drawing on international assessments like TIMSS and PIRLS, which demonstrate that proficiency in basic literacy and numeracy correlates strongly with long-term academic success and economic productivity. For instance, evidence from these global studies indicates that early gaps in foundational skills persist without intervention, justifying NAPLAN's role in prioritizing evidence-based standardization to mitigate regional variations in school quality. This framework promotes a federal-state compact for accountability in public systems, reducing tolerance for localized excuses for suboptimal results by tying outcomes to verifiable national metrics.

Role in School Accountability and Funding

NAPLAN results contribute to school accountability through the My School website, which publicly reports school-level performance data adjusted via the Index of Community Socio-Educational Advantage (ICSEA) to account for socioeconomic intake, enabling like-for-like comparisons across approximately 9,500 schools. This transparency mechanism, established under the Australian Education Act 2013, supports parental decision-making by highlighting variations in school performance and outcomes, thereby fostering competition and pressuring underperforming institutions to improve without direct performance-linked financial penalties. Although federal funding under the Schooling Resource Standard—derived from the 2011 Gonski Review—prioritizes needs-based allocations via loadings for factors such as low socioeconomic status, Indigenous enrollment, and remoteness rather than test scores, NAPLAN data informs supplementary targeted interventions for schools showing persistent low performance. For instance, analyses of NAPLAN outcomes have identified high-achieving disadvantaged schools, demonstrating that socioeconomic correlations with scores, while evident (e.g., lower ICSEA quintiles averaging 20-30 points below national means in reading and numeracy), do not predetermine failure, as effective leadership and teaching practices enable outliers to exceed expectations. By facilitating school rankings derived from NAPLAN metrics, such as those compiled annually by independent evaluators incorporating student-teacher ratios and other indicators alongside test data, the system counters institutional inertia through enrollment-driven funding dynamics, where shifts in parental choice—evidenced by increased scrutiny following the 2010 launch of My School—allocate resources toward higher-performing options. Critics note limited overall gains in national scores since inception, attributing this partly to indirect rather than incentive-based ties to funding, yet proponents argue the visibility sustains pressure for resource reallocation toward evidence-based improvements.
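The like-for-like comparison mechanism can be sketched as follows; the school records, ICSEA values, and similarity window are invented for illustration and do not describe ACARA's actual statistically-similar-schools methodology.

```python
# Hypothetical sketch of an ICSEA-based "like-for-like" comparison.
# School data and the +/- ICSEA window are invented; the real My School
# statistically-similar-schools methodology is more elaborate.
from __future__ import annotations

from statistics import mean

schools = [
    {"name": "School A", "icsea": 1012, "reading_mean": 498},
    {"name": "School B", "icsea": 1025, "reading_mean": 510},
    {"name": "School C", "icsea": 995,  "reading_mean": 471},
    {"name": "School D", "icsea": 1180, "reading_mean": 568},
]


def similar_schools(target: dict, pool: list[dict], window: int = 25) -> list[dict]:
    """Return schools whose ICSEA falls within +/- window of the target's."""
    return [s for s in pool
            if s["name"] != target["name"]
            and abs(s["icsea"] - target["icsea"]) <= window]


def compare(target: dict, pool: list[dict]) -> float:
    """Difference between the target's mean score and its similar-school average."""
    peers = similar_schools(target, pool)
    return target["reading_mean"] - mean(p["reading_mean"] for p in peers)


if __name__ == "__main__":
    school_a = schools[0]
    print(f"School A vs similar-ICSEA peers: {compare(school_a, schools):+.1f} points")
```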

Longitudinal Performance Data

National NAPLAN mean scaled scores remained largely stable from 2008 to 2022 across year levels and domains, with no evidence of overall decline in student achievement prior to the 2023 reporting scale changes. For instance, writing scores ranged between 464 and 484 points nationally over this period, while the percentage of students below the national minimum standard hovered around 10-15% annually in most domains and year levels. This stability persisted despite growing enrollments, though it also reflected persistent shortfalls relative to basic proficiency expectations rather than systemic collapse. Year 3 results showed modest improvements in average scores across all domains from 2008 to 2022, including incremental gains in mean reading scores amid consistent testing. Writing, however, exhibited persistent underperformance, with national means fluctuating due to assessment rubric adjustments in 2011 but remaining below comparable levels in reading and numeracy throughout the period. Approximately 85-90% of students met or exceeded the national minimum standard in core domains annually, though upper-band achievement (indicating stronger proficiency) lagged, particularly in writing for higher year levels. State-level variations underscored regional disparities in aggregate performance, with higher-performing jurisdictions recording Year 5 means 10-20 points above the national baseline, while lower-performing jurisdictions, notably the Northern Territory, trailed the baseline by similar or greater margins across years. These differences held steady over the 2008-2022 timeframe, reflecting enduring gaps in outcomes without convergence toward national averages.

Recent Outcomes Post-2023 Reforms

The 2025 NAPLAN results, released on July 30, 2025, revealed stable proportions of students below expected standards across literacy and numeracy domains, with roughly one-third falling into the 'Developing' or 'Needs Additional Support' categories nationally. This continuity persisted despite the full implementation of post-2023 reforms, including the transition to online adaptive formats, indicating no broad uplift in achievement levels. Participation rates exhibited modest post-COVID recovery, rising to near pre-pandemic levels in most jurisdictions, yet overall proficiency distributions remained consistent with 2023 and 2024 outcomes, when approximately 33% of students fell short of benchmarks. The online adaptive system facilitated expedited reporting—results available within weeks rather than months—and enhanced precision in scaling for low- and high-achieving students through item-response adjustments. However, the recalibrated proficiency standards introduced in 2023 limit longitudinal comparability, as data from 2022 and earlier cannot be directly mapped to the new scales optimized for adaptive testing. Notable but isolated gains appeared in select areas, such as numeracy, where marginal improvements aligned with jurisdiction-specific remedial programs rather than inherent effects of the test format changes. The digital shift has empirically lowered administrative burdens, including fewer transcription errors than paper tests, supporting more reliable data capture without altering underlying performance trends. Overall, these reforms prioritized operational efficiency and measurement precision over transformative impacts on student outcomes, underscoring persistent underachievement unaffected by assessment mechanics.

Demographic and Regional Variations

Indigenous students consistently underperform non-Indigenous peers across NAPLAN domains, with 2024 results showing them twice as likely to fall below proficiency benchmarks. In remote schools, where Indigenous enrollment predominates, nearly 60% of students failed to meet reading benchmarks, compared to rates around 20-30% in metropolitan areas. Socioeconomic disadvantage exacerbates these gaps, as students from lower parental occupation and education levels score 10-15 scale points below higher-SES counterparts in literacy and numeracy, a pattern persisting even after controlling for school resources. Gender disparities are pronounced in writing, where boys lag behind girls by an average of 20.9 scale score points nationally. In 2025, 53% of Year 9 boys failed to meet writing standards, twice the rate for girls, with similar patterns across Years 3, 5, and 7. Boys perform comparably or slightly better in numeracy but show no closure in literacy gaps over time. Regional variations highlight urban-rural divides, with non-metropolitan students averaging 5-10 points lower in numeracy and reading than urban peers in 2024. Non-government schools outperform government schools by 20-30 points overall, but multivariate analyses reveal family background as the dominant predictor, explaining up to 60% of variance in scores versus 10-15% attributable to school type after SES controls. Following the 2023 shift to online testing, 2024 and 2025 data confirm these demographic and regional gaps remain stable, with no significant narrowing of Indigenous, gender, or SES differentials. This persistence underscores the primacy of individual skill deficits over institutional factors in explaining outcomes.

Evaluations of Validity and Impact

Research on Reliability and Predictive Power

Psychometric analyses have established high internal reliability for NAPLAN assessments, with reliability coefficients for Grade 3 reading tests ranging from 0.88 to 0.89 across 2008–2010. These figures indicate strong reliability in measuring literacy and numeracy skills, supported by rigorous development processes including national trials, item equating to adjust for yearly variations in difficulty, and administration by trained staff to ensure consistency. ACARA's validation frameworks further confirm alignment with curriculum and measurement standards, minimizing measurement error through expert review and statistical modeling. Longitudinal studies demonstrate NAPLAN's predictive validity for subsequent academic outcomes. Correlations between Year 9 NAPLAN scores and Programme for International Student Assessment (PISA) results approximate 0.7 for reading and numeracy, with regression models linking the two scales (e.g., NAPLAN reading ≈ 316.56 + 0.54 × PISA reading). Similarly, NAPLAN composites correlate at about 0.7 with the Australian Tertiary Admission Rank (ATAR) and school completion metrics, enabling forecasts of post-school success such as university entry. Data linkage with the Longitudinal Surveys of Australian Youth (LSAY), achieved at a 98% success rate, reinforces these patterns, linking early NAPLAN performance to later employment and education trajectories. Claims of cultural or linguistic bias in NAPLAN are empirically mitigated by test design focused on core, curriculum-aligned skills with low contextual loading, alongside accommodations for English as a Second Language (ESL) students such as accessible item phrasing and jurisdictional supports like extra time or bilingual glossaries where approved. Predictive correlations persist across demographic cohorts in linked datasets, suggesting diagnostic utility over systemic unfairness. Low NAPLAN scores precede elevated risks of non-completion and suboptimal post-secondary outcomes, as evidenced by cohort analyses in which skill deficits at Years 3 or 9 forecast dropout probabilities independent of socioeconomic controls. This supports NAPLAN's role in identifying at-risk students for targeted interventions.
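As a worked illustration of the linkage equation quoted above (the input value of 500 is an arbitrary example, not a reported statistic):

```latex
% Linkage equation as quoted above, evaluated at an illustrative PISA reading score of 500
\text{NAPLAN reading} \approx 316.56 + 0.54 \times \text{PISA reading}
                      = 316.56 + 0.54 \times 500 = 586.56
```

The positive slope is consistent with the approximately 0.7 correlation between the two measures noted above.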

Influences on Teaching Practices and Student Outcomes

NAPLAN results have prompted data-informed adjustments in teaching practices, particularly in underperforming schools and regions, where low literacy and numeracy scores have correlated with increased adoption of evidence-based methods such as systematic phonics and explicit instruction. For instance, analyses of post-2008 NAPLAN data in Queensland revealed initial poor performance relative to other states, leading to targeted interventions emphasizing explicit teaching strategies that prioritize foundational skills, with subsequent localized improvements in reading and numeracy proficiency observed in affected cohorts. Similarly, national trends in declining reading scores have supported policy shifts toward phonics-focused curricula, as evidenced by state-level mandates such as New South Wales' 2024 syllabus overhaul requiring explicit instruction to address disparities highlighted by NAPLAN, with correlated gains for disadvantaged students in basic decoding and comprehension tasks. While some educators report "teaching to the test" practices, including curriculum narrowing to prioritize tested domains like grammar and basic arithmetic, causal evidence from quasi-experimental designs indicates these shifts yield net benefits in core competencies without substantial displacement of broader learning. Teacher surveys across Western Australia and South Australia, involving over 900 respondents, noted perceived reductions in non-tested subjects to allocate time for NAPLAN preparation, yet longitudinal tracking shows sustained or improved performance in aligned basics, with difference-in-differences analyses attributing minimal negative spillovers to overall achievement trajectories. Peer-reviewed reviews of NAPLAN's pedagogical effects confirm that while time reallocation occurs, it often reinforces explicit instruction in essentials, correlating with stronger foundations for future academic success rather than eroding holistic development. Regarding student outcomes, empirical studies find no robust evidence of widespread stress-induced performance declines attributable to NAPLAN; instead, persistent gaps align with pre-existing socioeconomic and prior attainment deficits rather than testing artifacts. A 2016 study across 11 independent schools in Western Australia, surveying students, parents, and teachers, reported minimal emotional distress impacts, with anxiety levels not significantly elevated post-testing and no causal link to reduced learning gains. Complementary pilot data from the same region echoed these findings, showing negligible effects on wellbeing metrics despite preparation pressures, while broader outcome analyses indicate NAPLAN's diagnostic role helps mitigate deficits through early intervention without exacerbating inequalities via test-related harms.

Controversies and Viewpoints

Arguments in Favor of Standardized Testing

Standardized testing via NAPLAN delivers consistent, nationwide benchmarks for literacy and numeracy proficiency, enabling policymakers and educators to pinpoint systemic weaknesses and implement targeted interventions that address causal factors in underperformance, such as ineffective instructional methods. Analysis of longitudinal NAPLAN data has informed evidence-based practices, including the identification of successful strategies like systematic phonics and explicit instruction, which have demonstrably elevated foundational skills in adopting schools. By linking results to public reporting on platforms like My School, NAPLAN fosters accountability in government-funded institutions, revealing performance disparities that challenge unsubstantiated rationales for stagnation and bolstering mechanisms for parental choice and resource reallocation toward higher-performing options. This transparency has spurred interventions, such as New South Wales' 2024 mandate for explicit instruction in the early years, aimed at closing gaps for disadvantaged students through direct skill-building rather than diffuse approaches. In high-stakes accountability contexts akin to those in select U.S. states with rigorous testing regimes, comparable metrics have driven gains in core competencies by enforcing merit-based evaluations over subjective assessments. NAPLAN's diagnostic utility supports targeted intervention by highlighting specific needs among at-risk cohorts, allowing for precise resource deployment—such as intensified support programs—that yields measurable improvements, outperforming equity initiatives reliant on unquantified ideals. For instance, Catholic systems leveraging NAPLAN insights to prioritize explicit teaching reported superior 2024 outcomes, demonstrating how testing-driven accountability elevates baselines across demographics without excusing variances as immutable. This approach counters anti-testing critiques by emphasizing verifiable outcomes over anecdotal satisfaction, with online adaptations promising faster feedback loops for schools and teachers.

Criticisms Regarding Equity and Stress

Critics have argued that NAPLAN assessments contain cultural biases that disadvantage students from low socioeconomic status (SES) backgrounds and Indigenous communities, as test content often reflects urban, mainstream cultural references unfamiliar to these groups, thereby limiting their performance relative to higher-SES peers. Persistent achievement gaps in NAPLAN results, such as those between Indigenous and non-Indigenous students across states and territories, have been cited as evidence of these inequities, with Indigenous students consistently scoring lower in literacy and numeracy domains. The shift to online NAPLAN testing, fully implemented by 2023, has drawn claims of widening the digital divide, particularly in rural and disadvantaged schools where insufficient devices, unreliable internet connectivity, and limited technical support hinder equitable participation. For example, remote areas with high concentrations of low-SES and Indigenous students face compounded access barriers, as noted in analyses of regional performance declines linked to infrastructural deficits. Concerns over student wellbeing center on NAPLAN's perceived high-stakes nature, with educators reporting elevated anxiety, stress, and fatigue among participants, particularly in Years 3, 5, 7, and 9. A 2024 study surveying teachers and principals found that 94% of teachers and 87% of principals attributed increased student stress and anxiety directly to the testing process, including time-constrained formats and public result reporting. Anecdotal accounts from school communities describe narrowed curricula, where instruction prioritizes test-specific skills like multiple-choice strategies over broader learning, potentially exacerbating emotional distress and prompting parental opt-outs in some cases. Some detractors, often from teacher unions and educational advocacy groups, portray NAPLAN as a neoliberal mechanism that privileges narrow, quantifiable metrics at the expense of holistic development, arguing it fosters a performative culture that ignores social-emotional growth and diverse learning needs. These views, prevalent in union campaigns and academic commentary, emphasize the tests' role in school rankings and funding pressures as drivers of systemic stress without offering adequate alternatives for assessing wellbeing.

Empirical Rebuttals and Causal Analyses

Empirical analyses indicate no robust causal connection between NAPLAN participation and widespread stress epidemics, as perceptions of heightened anxiety often rely on anecdotal reports rather than controlled studies demonstrating direct effects. A review of research highlights that while teachers and principals frequently perceive NAPLAN as a major stressor—with surveys reporting 94% of teachers and 87% of principals attributing anxiety to the tests—objective evidence linking the assessment to systemic declines in student wellbeing remains scant, with pre-existing youth anxiety trends predating its 2008 inception and correlating more strongly with broader factors such as family dynamics and wider social pressures. Similarly, achievement disparities across demographics and regions, often critiqued as exacerbated by standardized testing, trace to socioeconomic and familial influences observable in pre-NAPLAN data from international benchmarks such as PISA, where Indigenous and low-SES gaps exceeded 50 percentage points on equivalent measures as early as 2000, underscoring that such divides stem from foundational inequities rather than testing-induced causation. Causal scrutiny reveals that substituting NAPLAN's verifiable metrics with unanchored teacher assessments risks grade inflation and diminished accountability, as evidenced by patterns in Australian private schooling where parental pressures yield upwardly biased evaluations absent external calibration. Standardized tests like NAPLAN counter this by enforcing objective baselines, fostering modest systemic gains: post-implementation analyses show enhanced school-wide coordination on literacy and numeracy instruction, with 29% of surveyed educators attributing improved instructional focus to its transparency, without corresponding evidence of net academic harm. Persistent low proficiency—approximately one-third of students in the "needs additional support" or "developing" categories across 2024 domains like reading and numeracy—affirms the necessity of intensified testing rigor over dilution, as unverifiable alternatives would obscure deficiencies that hinder later attainment through inadequate basic skills. This empirical verdict prioritizes causal mechanisms: transparency drives targeted remediation, whereas abolition would entrench unaddressed shortfalls amid stable national results post-2023 reforms.
