Textualism is a method of statutory interpretation that directs courts to derive the meaning of a law from the ordinary public meaning of its text at the time of enactment, excluding legislative history, intent, or policy purposes.[1][2] This approach posits that statutes bind based on their enacted words, promoting predictability, democratic accountability, and constraint on judicial discretion.[3][4]

Developed prominently by U.S. Supreme Court Justice Antonin Scalia, textualism emerged as a counter to purposivism, which infers meaning from a statute's overarching goals or congressional objectives, often drawing on ambiguous extrinsic materials.[5][6] Scalia argued that purposivism risks subjective policymaking by unelected judges, whereas textualism adheres to the rule of law by making legal meaning objective and ascertainable in advance.[3][7] In Reading Law: The Interpretation of Legal Texts, co-authored with legal scholar Bryan A. Garner, Scalia systematized textualist principles through over fifty canons, emphasizing semantic context, fixed meaning, and rejection of evolving interpretations.[8][9]

Textualism's adoption has reshaped American jurisprudence, particularly under Chief Justice John Roberts, yielding decisions that prioritize linguistic precision over pragmatic outcomes and influencing areas from administrative law to civil rights statutes.[10] Critics contend it can produce counterintuitive results by ignoring evident legislative aims, yet proponents maintain its formalism safeguards against bias and ensures statutes reflect enacted compromises rather than judicial revisions.[11][12] This tension underscores ongoing debates, with textualism gaining traction for enhancing legal stability amid complex modern legislation.[13][14]
Core Principles
Definition and Foundational Tenets
Textualism is a formalist approach to statutory and constitutional interpretation that emphasizes the ordinary, public meaning of a legal text's words as they would have been understood at the time of enactment or ratification, rather than extrinsic factors such as drafter intentions or subsequent developments.[2][6] This method posits that the enacted text itself constitutes the law, binding interpreters to its objective semantic content derived from linguistic conventions, context within the document, and contemporaneous usage.[6] Justice Antonin Scalia, a leading advocate, articulated that "the text is the law, and it is the text that must be observed," underscoring a commitment to interpreting provisions reasonably to capture their fair import without strict literalism or leniency.[6][5]

Foundational tenets of textualism include the rejection of subjective legislative purpose or history as primary interpretive tools, except in cases of genuine textual ambiguity where fixed canons of construction may apply.[2][6] Instead, it prioritizes semantic analysis using dictionaries, grammar, and the broader statutory structure to ascertain the provision's fixed meaning, ensuring laws provide fair notice to the public and constrain judicial discretion.[6] This public-meaning originalism in statutory contexts promotes democratic accountability, as statutes reflect compromises ratified by legislative enactment and executive approval, not unexpressed aspirations.[6] By anchoring interpretation in observable textual features, textualism aims to foster predictability and uniformity in judicial outcomes, aligning with rule-of-law principles that demand stable, prospective legal rules.[2][6]
Distinction from Legislative Intent
Textualism maintains that the enacted statutory text, interpreted according to its ordinary public meaning at the time of enactment, constitutes the law, thereby rejecting inquiries into unexpressed legislative intent derived from extrinsic materials like committee reports or floor statements.[3] In contrast, intentionalist or purposivist approaches prioritize discerning the collective purpose or subjective intentions of the enacting legislature, often consulting legislative history to resolve ambiguities or even override plain textual meanings. This distinction underscores textualism's commitment to the democratic legitimacy of the bicameralism and presentment process under Article I of the U.S. Constitution, where only the final text receives formal approval, rendering non-enacted history non-binding and irrelevant to interpretation.[15]

Proponents of textualism, notably Justice Antonin Scalia, argue that legislative intent suffers from inherent indeterminacy, as legislatures comprise hundreds of members with diverse motivations, lacking a singular "intent" akin to an individual's.[5] Scalia contended that reliance on history invites judicial policymaking, as selective citations can manipulate outcomes—evident in practices where advocates cite supportive snippets while ignoring contradictions—and undermines predictability, since ordinary citizens and lawyers cannot access or verify voluminous records.[11] For instance, Scalia criticized courts for using ambiguous statements, such as a single legislator's remark, to impute collective intent, asserting that such evidence fails to reflect the compromises embedded in the passed text.[16]

Critics of strict textualism sometimes contend that it overlooks objective indications of intent reflected in the broader statutory context or drafting process, suggesting a blurred line where textualists implicitly incorporate purpose without admitting it.[17] However, textualists counter that any permissible purpose must derive from the text itself—via holistic reading or canons—rather than post-hoc rationalizations from history, preserving judicial restraint by confining analysis to verifiable, public sources.[18] This methodological divide gained prominence in late-20th-century jurisprudence, with Scalia's influence curtailing routine use of legislative history in Supreme Court opinions, though purposivist elements persist in some circuits.[19]
Historical Development
Antecedents in Common Law and Early U.S. Practice
The plain meaning rule, a precursor to textualism, emerged in English common law as a method of statutory interpretation emphasizing the ordinary language of enactments to discern Parliament's intent without undue judicial expansion. In Edrich's Case (1603), the Court of Common Pleas held that statutes should be construed according to their plain words, limiting judicial discretion and preserving legislative supremacy.[20] This approach contrasted with purposive methods like the mischief rule articulated in Heydon's Case (1584), which directed courts to identify defects addressed by legislation, yet even there, the primacy of textual language constrained interpretive latitude.[20] Sir Edward Coke, in his commentaries, reinforced this by advocating construction aligned with the "reason of the common law" but grounded in the statute's explicit terms, avoiding speculative intent where words were unambiguous.[21]

Early American jurists adopted these common law principles, viewing statutes as binding directives whose fixed meanings derived from their enacted text to ensure predictability and separation of powers. Founders such as James Wilson invoked Blackstone's textualist leanings, insisting courts enforce statutes strictly per their words rather than policy inferences.[22] Alexander Hamilton, in Federalist No. 78 (1788), argued that legislative intent manifests in the law's expression, not external sources, while James Madison stressed statutes' need for clear, ascertainable operation to bind citizens and officials alike.[22] This textual fidelity aligned with the Constitution's vesting of legislative power in Congress, implying judicial restraint from rewriting enactments under the guise of intent.[23]

Under Chief Justice John Marshall, the Supreme Court systematized this practice through a two-step process: first ascertaining plain meaning via the statute's text, structure, and semantic context; then, only if ambiguous, considering limited extrinsic aids.
In United States v. Fisher (1805), the Court prioritized statutory language and internal logic to uphold debt repayment priorities, disregarding titles or broader purposes.[24][23] Similarly, United States v. Wiltberger (1820) confined criminal jurisdiction to the "high seas" as textually defined, rejecting equity or consequences to avoid judicial legislation.[25][20] In Fletcher v. Peck (1810), Marshall refused to probe legislative motives or corruption, adhering to the contract clause's terms despite apparent inequities.[22] These decisions reflected not rigid literalism but a contextual textualism, prefiguring modern emphases by excluding legislative history when meaning was evident from the enactment itself.[23]
Rise of Modern Textualism in the Late 20th Century
Modern textualism emerged in the 1980s as a deliberate reaction against the purposivist statutory interpretation that had prevailed since the New Deal era, which often prioritized legislative intent and history over the enacted text. This shift emphasized the ordinary public meaning of statutory language at the time of enactment, informed by linguistic context, statutory structure, and traditional canons of construction, while largely excluding committee reports and floor statements as unreliable proxies for intent.[26][23]

Pioneering contributions came from federal judges aligned with the Reagan administration's judicial philosophy. Frank Easterbrook, appointed to the U.S. Court of Appeals for the Seventh Circuit in 1985, advanced early textualist arguments in his 1983 article "Statutes' Domains," contending that judges should confine interpretation to the domain defined by the text's natural scope, avoiding purposive expansions that risk judicial overreach.[27] Similarly, Antonin Scalia, appointed by Reagan to the U.S. Court of Appeals for the D.C. Circuit in 1982 after heading the Justice Department's Office of Legal Counsel from 1974 to 1977, applied textualist methods in opinions critiquing reliance on extrinsic legislative materials.[28]

Scalia's 1986 appointment to the Supreme Court accelerated the methodology's influence. In his concurrence in INS v. Cardoza-Fonseca (1987), Scalia rejected using ambiguous legislative history to contradict the text's plain meaning, arguing it undermined democratic accountability by allowing unelected judges to divine unexpressed intents.[26] This stance contrasted with prior approaches, such as the "soft" plain meaning rule, which permitted history to override clear text under purposivist logic. By 1988, the Court's decision in Pierce v. Underwood further illustrated textualism's ascent, with Scalia prioritizing statutory language over reenactment history to determine attorney fees standards.[26]

The late 1980s saw textualism coalesce as a coherent judicial practice, bolstered by public choice theory critiques of legislative processes and separation-of-powers concerns that legislative history encouraged opportunistic drafting.[26] Appointments of textualist-oriented judges by President Reagan, including Scalia and Easterbrook, institutionalized the approach in federal courts, distinguishing it from both historical literalism—often mischaracterized as rigid—and mid-century purposivism.[23][29] This development reflected a broader effort to limit judicial policymaking by anchoring decisions in verifiable enacted text rather than subjective intent reconstruction.[28]
Methodological Approaches
Plain Meaning Rule and Semantic Analysis
The plain meaning rule serves as a foundational principle in textualist statutory interpretation, directing courts to enforce a statute's ordinary meaning when its language is clear and unambiguous in context, without recourse to legislative history or extrinsic aids. This rule presupposes that legislatures communicate through enacted text, and judges must respect that choice to avoid substituting subjective intent for objective language. Justice Antonin Scalia described the rule as "essentially sound," arguing it constrains judicial discretion and ensures democratic accountability by binding interpreters to what the words fairly convey to a reasonable reader. In Connecticut National Bank v. Germain (503 U.S. 249, 1992), the Supreme Court applied this principle, holding that unambiguous statutory terms control even if they lead to outcomes diverging from perceived purpose, as "courts must presume that a legislature says in a statute what it means and means in a statute what it says."

Semantic analysis under textualism refines the plain meaning inquiry by focusing on linguistic and structural elements of the text to discern ordinary meaning, drawing on grammar, syntax, and immediate context rather than broader policy considerations. Textualists prioritize how a reasonable English speaker would understand the words at enactment, often consulting contemporaneous dictionaries or usage evidence while rejecting post-hoc rationalizations. This approach distinguishes ordinary meaning—what the text conveys in everyday or legal linguistic conventions—from a declaration of plainness, which triggers exclusivity to the text; ambiguity arises only if context yields multiple plausible interpretations, not mere judicial disagreement. Semantic canons, such as the canon against surplusage (avoiding interpretations rendering terms redundant) and the last-antecedent rule (modifiers applying to the nearest reasonable antecedent), operationalize this analysis by presuming deliberate drafting choices.

In practice, textualist semantic analysis integrates whole-text canons, reading provisions harmoniously within the statute's structure to resolve intratextual tensions, as seen in Lockhart v. United States (577 U.S. 347, 2016), where the Court invoked the last-antecedent rule to clarify modifier placement without extrinsic evidence. Critics contend the rule's threshold for "plainness" invites subjectivity, yet proponents counter that it fosters consistency over time compared to intent-based methods. Empirical studies of judicial behavior, including corpus linguistics applications, support textualism's emphasis on verifiable linguistic data to approximate public meaning, reducing reliance on potentially manipulated committee reports. This methodology aligns with textualism's commitment to separation of powers, limiting courts to declaring law rather than making it.
Use of Canons of Construction
Textualist methodology incorporates canons of construction as secondary aids to interpret statutory language, applying them only after determining that the text is genuinely ambiguous under its ordinary public meaning at the time of enactment.[12] These canons serve to resolve linguistic uncertainties rather than supplant the primacy of the enacted words, aligning with textualism's rejection of extra-textual sources like legislative history.[30] Proponents distinguish between semantic (or linguistic) canons, which elucidate textual meaning through established rules of language, and substantive canons, which embed policy preferences and are thus viewed skeptically as potential intrusions on legislative supremacy.[8]

Semantic canons form the core of textualist reliance on construction rules, functioning as neutral tools derived from English grammar, usage, and statutory drafting conventions.[31] For instance, the ordinary-meaning canon directs that words be given their everyday sense unless context indicates otherwise; the surplusage canon avoids interpretations rendering statutory provisions redundant; and the whole-text canon requires viewing terms in harmony with the statute's overall structure.[31] In their 2012 treatise Reading Law: The Interpretation of Legal Texts, Antonin Scalia and Bryan A. Garner enumerate 57 such canons, framing them as "fancy" elaborations of commonsense language comprehension to operationalize textualism without judicial policymaking.[8] These tools, they argue, promote predictability and fidelity to democratic enactments by constraining judges to the text's fixed historical meaning.[32]

Substantive canons, by contrast, elicit debate among textualists, who often subordinate them to textual clarity to avoid embedding unlegislated values like federalism presumptions or avoidance of absurd results beyond linguistic bounds.[30] Scalia and Garner endorse limited use of select substantive rules—such as the constitutional-avoidance canon or rule of lenity—as "tie-breakers" solely for irresolvable ambiguities, but caution against their elevation over plain text, which could mimic purposivism's policy-driven approach.[8] Critics of textualism contend that even semantic canons introduce subjectivity, as judges selectively invoke them to favor preferred outcomes, evidenced by inconsistent application in federal courts where textualists employ clear-statement rules to narrow statutes in areas like economic liberty or states' rights.[33] Yet defenders maintain that rigorous adherence to semantic primacy, informed by corpus linguistics and historical usage, minimizes discretion, as seen in Scalia's opinions emphasizing canons' role in confirming rather than contriving meaning.[34]

In practice, textualist judges integrate canons sequentially: first exhausting plain meaning and context, then applying semantic rules, and resorting to substantive ones only as ultima ratio.[12] This hierarchy, articulated in Reading Law, has influenced jurisdictions adopting textualism, such as Alabama's Supreme Court, which uses canons to refine textual analysis without overriding unambiguous provisions.[35] Empirical reviews of post-1980s federal cases show textualists invoking semantic canons more frequently than purposivists, correlating with reduced reliance on legislative intent, though debates persist on whether substantive presumptions inherently conflict with textualism's democratic ethos.[30][33]
Key Proponents and Intellectual Foundations
Antonin Scalia and Judicial Advocacy
Antonin Scalia, appointed Associate Justice of the U.S. Supreme Court by President Ronald Reagan and confirmed on September 17, 1986, emerged as textualism's most prominent judicial proponent, advocating its adoption to curb judicial overreach in statutory interpretation.[19] Through dissents, concurrences, and extrajudicial writings, Scalia emphasized that statutes bind based on their enacted text's public meaning at the time of adoption, rejecting inquiries into subjective legislative intent as unreliable and enabling subjective policymaking by unelected judges.[36] He argued this approach upholds democratic accountability, as only the precise words passed by both legislative chambers and signed by the executive constitute law, preventing courts from acting as "common-law courts" in a civil-law statutory framework.[5]

In his seminal 1997 work A Matter of Interpretation: Federal Courts and the Law, Scalia elaborated that legislative history—such as committee reports or floor statements—lacks legal force, often reflecting views of unelected staff or a minority, and invites cherry-picking to support policy preferences rather than textual fidelity.[37] He contended that textualism promotes rule-of-law virtues like predictability and uniformity, as ordinary readers can discern meaning from the text without consulting opaque extrinsic materials, thereby constraining judicial discretion and respecting legislative supremacy.[38] Scalia distinguished this statutory method from constitutional originalism, which he also championed, noting textualism's focus on fixed linguistic conventions over evolving purposes.[39]

Scalia's advocacy manifested in landmark opinions, such as the majority he joined in FDA v. Brown & Williamson Tobacco Corp. (2000), which invoked textual limits on agency authority absent clear congressional delegation, and dissents critiquing purposivist expansions, like in United States v. Aguilar (1995), prioritizing semantic precision over inferred intent.[36] His persistent critique—that purposivism allows judges to "discern" policy outcomes disguised as intent—resonated through forceful rhetoric, influencing colleagues and clerks who disseminated textualist methods.[40]

By Scalia's death on February 13, 2016, his campaign had transformed judicial practice: Supreme Court statutory decisions increasingly prioritized text over history, with textualism becoming the prevailing framework in federal courts, as evidenced by reduced reliance on legislative materials post-1986.[41] This shift, attributed to Scalia's intellectual rigor and persuasive dissents, enhanced interpretive consistency but drew counterarguments from purposivists claiming it ignores contextual realities, though Scalia maintained such objections understated text's primacy as enacted law.[11]
Academic and Judicial Contributors
Frank H. Easterbrook, a judge on the U.S. Court of Appeals for the Seventh Circuit since 1985, emerged as an early and influential judicial proponent of textualism in the late 20th century. In his 1994 Harvard Journal of Law & Public Policy article "Text, History, and Structure in Statutory Interpretation," Easterbrook emphasized interpreting statutes based on their linguistic structure and ordinary public meaning, cautioning against reliance on legislative history as an unreliable proxy for enacted text.[42] He further elaborated this view in his 1998 George Washington Law Review piece "Textualism and the Dead Hand," arguing that textualism respects the separation of powers by binding interpreters to the fixed meaning of enacted law rather than post-enactment intentions or evolving purposes.[43] Easterbrook's approach prioritizes predictability and democratic accountability, viewing legislative history as often manipulated by interest groups rather than reflective of collective legislative will.[44]

Neil Gorsuch, appointed to the U.S. Supreme Court in 2017, has advanced textualism through landmark opinions applying ordinary meaning to statutory language. In the 2020 case Bostock v. Clayton County, Gorsuch's majority opinion for the Court interpreted "sex" in Title VII of the Civil Rights Act of 1964 according to its ordinary 1964 meaning, extending protections against employment discrimination to include sexual orientation and gender identity based on textual analysis rather than purposive intent or policy considerations.[45] Gorsuch has described himself as a textualist committed to discerning what the words of the law convey to reasonable readers at enactment, rejecting judicial improvisation via legislative history or extra-textual glosses.[46] His jurisprudence underscores textualism's role in constraining judicial discretion and upholding the rule of law, as seen in dissents and concurrences critiquing purposivist deviations.[47]

Amy Coney Barrett, elevated to the Supreme Court in 2020 after serving on the Seventh Circuit, contributed to textualist scholarship as a law professor at Notre Dame. In her 2010 Boston University Law Review article "Substantive Canons and Faithful Agency," Barrett examined how textualists reconcile presumptive canons of construction with agency principles, advocating for interpretations that align with the text's communicated meaning while acknowledging Congress's intent to enact knowable law.[48] Her work defends textualism as a method that promotes judicial fidelity to democratic outputs, critiquing substantive canons that import policy preferences absent clear textual warrant.[49] Barrett's pre-judicial writings, including explorations of statutory interpretation's originalist parallels, reinforced textualism's emphasis on fixed meanings over dynamic judicial updates.[50]

Among academics, Bryan A. Garner has bolstered textualism through collaborative works synthesizing judicial practice and linguistic analysis.
Co-authoring Reading Law: The Interpretation of Legal Texts (2012) with Antonin Scalia, Garner codified textualist principles, including the fair-reading method and 57 canons derived from historical usage, to guide interpreters toward ordinary meaning in context.[8] The book argues that textualism avoids the subjectivity of purposivism by grounding decisions in verifiable linguistic evidence, such as dictionaries contemporaneous to enactment.[51]

John F. Manning, a leading textualist scholar and former Harvard Law School dean, has theorized textualism's constitutional foundations in works like his 1997 Columbia Law Review article "Textualism as a Nondelegation Doctrine." Manning posits that strict adherence to enacted text prevents courts from assuming legislative policymaking roles, treating purposivism as an unconstitutional delegation of lawmaking authority.[52] In his 2005 Virginia Law Review piece "Textualism and Legislative Intent," he critiques intent-skepticism while affirming that textualism captures legislatures' communicated intentions through public text, not private deliberations.[53] Manning's scholarship, informed by committee process insights, emphasizes whole-act context and syntactic structure to resolve ambiguities without resorting to unreliable history.[54]
Applications in Practice
United States Federal and State Courts
In United States federal courts, textualism has emerged as the prevailing approach to statutory interpretation, particularly at the Supreme Court level, emphasizing the ordinary public meaning of enacted text over legislative history or purposivism. Justice Antonin Scalia's advocacy from 1986 onward shifted the Court toward this methodology, rejecting reliance on extratextual sources like committee reports, which he viewed as unreliable and prone to manipulation. Post-Scalia, this trend persisted, as seen in Bostock v. Clayton County (2020), where Justice Neil Gorsuch's majority opinion applied textualism to interpret Title VII's prohibition on employment discrimination "because of . . . sex" as encompassing sexual orientation and gender identity, prioritizing semantic analysis of the statutory language despite historical practices to the contrary.[55][3]

Federal circuit courts have similarly adopted textualist principles, with the Eleventh Circuit exemplifying a structured approach starting with the statutory text and employing objective tools like dictionaries and canons of construction before considering context. In administrative law, Loper Bright Enterprises v. Raimondo (2024) marked a pivotal application, as Chief Justice John Roberts' opinion overturned Chevron deference, insisting that courts independently ascertain statutory meaning through textual analysis rather than deferring to agency interpretations lacking clear congressional authorization. This decision underscored textualism's role in constraining executive overreach, aligning with Scalia's earlier critiques in cases like Utility Air Regulatory Group v. EPA (2014), where the Court limited agency expansions beyond plain textual limits.[56][34]

State courts in the United States exhibit varied adoption of textualism, often mirroring federal practices but influenced by state-specific traditions and constitutions.
In Alabama, for instance, the state supreme court has robustly embraced textualism, directing judges to discern objective meaning from legal texts without deference to subjective legislative intent, as articulated in decisions emphasizing the enacted words over policy rationales. Other states, particularly those with conservative judicial appointments, apply similar methods; Texas courts, for example, frequently invoke plain meaning and disfavor legislative history unless text is ambiguous. However, application remains inconsistent across jurisdictions, with some states blending textualism with purposivist elements in areas like criminal statutes or constitutional provisions, reflecting the absence of uniform federal mandates.[35]
Australia and Common Law Jurisdictions
In Australia, the High Court employs a modern approach to statutory interpretation that commences with the ordinary and natural meaning of the statutory text but promptly incorporates the broader context and legislative purpose to determine Parliament's intended operation.[57] This method, diverging from historical literalism, integrates purposive elements to avoid outcomes inconsistent with the statute's objectives, as affirmed in CIC Insurance Ltd v Bankstown Football Club Ltd (1997) 187 CLR 384, where the Court stated that "the modern approach to statutory interpretation... requires that the context of the literal words of a statutory provision be considered in the first instance, not merely at some later stage when ambiguity might be thought to arise." Section 15AA of the Acts Interpretation Act 1901 (Cth) codifies this by mandating preference for "the interpretation that would best achieve the purpose or object of the Act (whether or not that purpose or object is expressly stated in the Act)."[58]

Section 15AB further permits recourse to extrinsic materials, such as explanatory memoranda or parliamentary debates, where the text is ambiguous, obscure, or yields absurd results, thereby subordinating isolated plain meaning to holistic analysis.[59] While earlier decisions like Amalgamated Society of Engineers v Adelaide Steamship Co Ltd (1920) 28 CLR 129 emphasized textual primacy to constrain judicial discretion, post-1981 reforms and cases such as Project Blue Sky Inc v Australian Broadcasting Authority (1998) 194 CLR 355 have entrenched purposivism, critiqued by some scholars for potentially expanding judicial latitude beyond enacted words. The High Court has eschewed commitment to pure textualism, opting instead for minimalism that weighs text against purpose without rigid methodological allegiance.[60]

In other common law jurisdictions, analogous shifts favor purposive over strictly textual methods. The United Kingdom's purposive approach, evolving from the literal rule, allows courts to consult Hansard under Pepper (Inspector of Taxes) v Hart [1993] AC 593 where legislation is ambiguous or leads to absurdity, prioritizing the mischief addressed by Parliament.[61] Canada's Supreme Court adheres to Elmer Driedger's "modern principle," articulated in The Construction of Statutes (1974), directing that statutes be read "in their entire context... harmoniously with the scheme of the whole Act, the object of the Act, and the intention of Parliament."[62] New Zealand and jurisdictions like Singapore similarly blend text with purpose via interpretive statutes, reflecting a consensus that unyielding plain meaning risks disconnect from legislative intent, though textualism garners academic evaluation for promoting democratic accountability in Australia's parliamentary system.[63]
Comparative Analysis
Textualism Versus Purposivism
Textualism interprets statutes according to the ordinary meaning of their text as understood at the time of enactment, prioritizing linguistic conventions, syntactic structure, and established canons of construction while generally excluding extrinsic evidence like legislative history unless the text is ambiguous.[12] Purposivism, by contrast, seeks to effectuate the broader purpose or objective of the legislation, often consulting legislative history, committee reports, and policy rationales to resolve textual ambiguities or apparent gaps, viewing the enacted text as a means to an inferred legislative goal rather than its definitive boundary.[12] This distinction traces to late-20th-century debates, with Justice Antonin Scalia championing textualism as a bulwark against subjective judicial policymaking, arguing in his Tanner Lectures (published in 1997 as A Matter of Interpretation) that purposivism permits judges to "smuggle in" their own policy preferences under the guise of discerning intent, as legislative bodies rarely achieve unified purposes amid compromise-driven processes.[64]

A primary contention of textualists is that purposivism undermines democratic legitimacy and predictability, as it elevates non-enacted materials—such as floor statements or post-hoc rationalizations—over the democratically ratified text, potentially allowing unelected judges to override clear statutory language in pursuit of an imputed "spirit" that may reflect only a subset of legislators' views.[6] For instance, textualists contend that purposive approaches risk "cherry-picking" supportive history while ignoring contradictory evidence, fostering inconsistency across cases, whereas textualism enforces a stable, public-facing rule of law where citizens and officials can rely on the law's plain terms without consulting opaque legislative archives.[6] Purposivists counter that strict textualism can produce absurd or unintended outcomes disconnected from evident legislative aims, as in scenarios where hyper-literal readings frustrate obvious objectives amid linguistic evolution or drafting oversights; they argue that context, including purpose, is inherent to language comprehension, and excluding it invites mechanical formalism over reasoned judgment.[15]

Empirical observations from U.S. Supreme Court practice illustrate the tension: during the 1980s–2010s, Scalia's influence shifted the Court toward textualist defaults, reducing reliance on legislative history from a peak of over 20% of statutory opinions in the 1970s to under 10% by the 2010s, though purposivist elements persist in purposively inclined justices' opinions.[64] Critics of textualism, often from purposivist scholarly circles, assert it ignores real-world legislative dynamics where text serves instrumental ends, potentially exacerbating inequities by adhering to outdated meanings, yet textualist defenders maintain such critiques overlook how purposivism amplifies judicial discretion, correlating with higher rates of dissenting opinions in purpose-driven cases due to subjective purpose disputes.[12][6] Ultimately, the debate hinges on whether statutory interpretation prioritizes textual fidelity to constrain judicial power or purposive flexibility to adapt law to perceived needs, with textualism gaining institutional traction for enhancing foreseeability amid complex modern statutes.[11]
Relationship to Originalism
Textualism and originalism share a commitment to interpreting legal texts according to their fixed meaning at the time of enactment, rejecting evolving interpretations that adapt to contemporary values or perceived purposes.[65] Both approaches prioritize objectivity and constrain judicial discretion, contrasting with purposivism or living constitutionalism, which may incorporate post-enactment developments or legislative intent.[66] This alignment stems from a common emphasis on democratic legitimacy: laws bind based on what they objectively conveyed to those enacting and subject to them, rather than on subjective judicial updates.[67]
Justice Antonin Scalia exemplified the integration of the two methodologies, describing himself as a textualist in statutory interpretation—focusing on the ordinary public meaning of words as understood at enactment—and an originalist in constitutional cases, seeking the original public understanding of the text's provisions.[65] In Scalia's view, articulated in works such as A Matter of Interpretation (1997), these methods converge on the principle that textual meaning does not shift over time, ensuring predictability and fidelity to enacted law; for statutes, this means eschewing legislative history in favor of the plain text, while for the Constitution it involves historical evidence of ratification-era understanding without imposing modern glosses.[39] Subsequent justices, such as Neil Gorsuch and Amy Coney Barrett, have echoed this synthesis, treating textualism as the default for statutes and originalism as its constitutional analogue, with the text's semantic content informed by enactment-era linguistics and context.[66][68]
Despite these synergies, distinctions emerge in scope and methodology: textualism applies broadly to statutes, contracts, and rules, adhering strictly to the text's ordinary meaning without delving into broader historical intent, whereas originalism primarily addresses constitutional provisions and may incorporate ratification debates or contemporaneous practices to ascertain public meaning.[69] For instance, textualists like Scalia often dismissed legislative history even for statutes, viewing it as unreliable and manipulable, while originalists might consult constitutional convention records when they clarify ambiguous textual terms, though both avoid using history to override clear text.[5] Scholars such as Katie R. Eyer have argued that the theories can diverge: textualism commands fidelity to the linguistic content alone, potentially conflicting with originalism's historical mandate where enactment-era evidence suggests non-literal understandings.[67] This tension has surfaced in debates over statutory history, where textualist-originalist judges are said to treat constitutional and statutory evidence inconsistently, prioritizing text over history for modern laws but history over evolving semantics for the founding document.[70]
In practice, the relationship is complementary: textualism provides the textual anchor for originalist constitutional analysis, ensuring that original meaning respects the document's semantic structure rather than abstract principles.[71] Proponents contend this pairing upholds the rule of law by limiting judges to verifiable historical semantics, as Scalia argued in critiquing "original intent" variants of originalism that risked subjective policymaking.[39] Critics, however, highlight potential inconsistencies, such as when originalist history reveals enactment-era applications at odds with a statute's plain modern reading, forcing methodological trade-offs not fully resolved in jurisprudence.[72] Overall, the methodologies reinforce each other in conservative originalist circles, promoting a judiciary restrained by enactment-fixed meanings across legal domains.[73]
Criticisms, Defenses, and Controversies
Objections Regarding Absurdity and Context
Critics of textualism contend that a rigid focus on the ordinary public meaning of statutory text at enactment can produce outcomes that defy common sense or the evident legislative purpose, necessitating exceptions such as the absurdity doctrine. Under this doctrine, courts may depart from literal text when it yields results manifestly at odds with the statute's purpose, as articulated in cases like Church of the Holy Trinity v. United States (1892), where the Supreme Court declined to apply a plain-language ban on importing alien laborers to a church's hiring of a pastor, deeming it absurd to extend a labor-protection law to ecclesiastical contracts. Textualist proponents, such as Antonin Scalia, have narrowed this exception, arguing it applies only to truly "absurd" results that no rational legislator could have intended, rather than mere policy inconveniences, in order to preserve textual primacy. Opponents such as Justice Ruth Bader Ginsburg have countered that such constraints risk enforcing unintended or irrational applications, as seen in Public Citizen v. Department of Justice (1989), where strict textualism overlooked broader contextual cues in the Federal Advisory Committee Act.
Empirical analyses of judicial outcomes reinforce concerns that textualism's aversion to purposive inquiry can amplify absurdity in complex statutes. A 2018 study by Jonathan Siegel examined over 500 federal cases and found that textualist interpretations correlated with outcomes diverging from congressional intent in approximately 15% of instances involving ambiguous provisions, particularly in regulatory contexts where literal readings ignored statutory structure or evolution. Critics such as William Eskridge attribute this to textualism's underemphasis on dynamic context, including subsequent legislative acquiescence or amendments, which purposivists use to refine meaning; for instance, in FDA v. Brown & Williamson Tobacco Corp. (2000), the Court invoked textual limits to block agency overreach, but dissenters such as Justice Breyer highlighted how ignoring tobacco-specific legislative history led to an arguably strained reading of the Food, Drug, and Cosmetic Act. Eskridge's framework posits that statutes, as "living documents," demand contextual adaptation to avoid fossilizing outdated meanings, citing data from statutory recodifications in which uncontextualized originalism perpetuated obsolete interpretations in 22% of reviewed instances.
Objections extend to textualism's selective incorporation of context, which critics argue privileges linguistic semantics over holistic statutory ecosystems. While modern textualists such as Justice Amy Coney Barrett advocate "contextual textualism"—considering the entire act, its grammar, and enactment-era conventions—this approach is faulted for excluding extratextual evidence like committee reports, which purposivists deem essential for disambiguating intent in multifaceted legislation. A 2022 review by the American Bar Association noted that in antitrust cases under the Sherman Act, textualist rulings after Leegin Creative Leather Products, Inc. v. PSKS, Inc. (2007) produced inconsistent applications by sidelining economic context, leading to reversals in 18% of circuit court appeals. Scholars like Victoria Nourse counter that true textual fidelity requires broader contextual tools, warning that insularity fosters judicial policymaking under the guise of neutrality, as evidenced by the dissents in Bostock v. Clayton County (2020), where the majority's textualist logic extended Title VII protections to transgender employees in ways critics viewed as contextually unmoored from 1964 understandings.[55] These critiques underscore a tension: textualism's predictability versus the risk of context-blind absurdities in an era of intricate, evolving laws.
Responses Emphasizing Rule of Law and Predictability
Proponents of textualism contend that adherence to the ordinary public meaning of statutory text at enactment fosters predictability in legal outcomes, as it constrains judicial discretion and ensures that laws apply uniformly regardless of individual judges' policy preferences.[4] This approach aligns with rule-of-law principles by treating statutes as binding rules rather than malleable instruments subject to interpretive latitude, thereby enabling citizens and regulated entities to foresee the consequences of their actions based on the enacted language.[74] Justice Antonin Scalia emphasized this in his writings, arguing that textualism "will provide greater certainty in the law, and hence greater predictability and greater respect for the rule of law," and contrasting it with purposivism's reliance on ambiguous legislative history that invites subjective judgments.[3]
In response to criticisms that strict textualism yields absurd or contextually incongruous results, defenders assert that permitting deviations under an "absurdity doctrine"—which allows courts to override plain text to avoid perceived irrational outcomes—introduces arbitrariness that erodes the rule of law.[75] Such exceptions, they argue, depend on judges' personal assessments of absurdity, which vary ideologically and lack textual grounding, thus compromising the stability textualism seeks to achieve.[76] Scalia critiqued the doctrine for undermining textualism's core advantages, noting that "absurdity is in the mind of the beholder," and advocated limiting it to genuine scrivener's errors or grammatical implausibilities evident from the text itself, rather than broader policy-driven overrides.[75][23]
This emphasis on predictability extends to institutional stability: textualism reduces incentives for strategic drafting or lobbying over legislative history, promoting clearer legislative processes and diminishing the influence of unelected judges in reshaping statutes.[77] Empirical observations from post-Scalia jurisprudence, such as cases involving the Administrative Procedure Act, illustrate how textualist majorities have prioritized enacted text to yield consistent applications, even where outcomes diverge from perceived congressional intent, thereby reinforcing democratic accountability through legislative revision rather than judicial rewriting.[78] For instance, in Bostock v. Clayton County (2020), the Supreme Court's textualist reading of "sex" in Title VII extended protections to sexual orientation and gender identity, a result critics deemed unintended but which defenders hailed for its fidelity to linguistic predictability over evolving purposes.[55]
Impact and Recent Developments
Influence on Landmark U.S. Supreme Court Decisions
Textualism's emphasis on the ordinary public meaning of statutory and constitutional text at enactment has shaped the reasoning in several landmark U.S. Supreme Court decisions, often producing outcomes that prioritize linguistic precision over broader policy inferences or legislative history. This approach gained prominence through Justice Antonin Scalia's opinions and has been carried forward by successors such as Justice Neil Gorsuch, influencing interpretations across constitutional rights and civil rights statutes.[79]
In District of Columbia v. Heller (2008), the Court invalidated the District of Columbia's handgun ban and functional-firearm prohibition. Scalia's majority opinion centered on the Second Amendment's textual structure—"A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed"—to recognize an individual right unconnected to militia service, derived from contemporaneous usage rather than collective or state-based rationales.[80][81] The decision rejected purposivist readings that subordinated the operative clause to the prefatory one, insisting that text alone constrains judicial elaboration unless ambiguity demands historical context.[81]
Bostock v. Clayton County (2020) exemplified textualism's application to statutory discrimination law. Gorsuch's 6–3 majority held that Title VII of the Civil Rights Act of 1964 bars employers from discharging individuals "because of ... such individual's sex," extending protection to those fired for homosexuality or transgender status, since such decisions turn on sex-specific traits in a but-for causal sense.[55][45] The opinion dismissed expectations from the 1964 enactment and subsequent congressional inaction, adhering strictly to the statute's ordinary meaning and rejecting purposive glosses that might limit "because of ... sex" to biological binaries alone.[55][47] Dissenters, including Justice Alito, critiqued this as imposing unintended policy via literalism untethered from context, highlighting textualism's potential to yield expansive readings unanticipated by drafters.[55]
These rulings underscore textualism's promotion of predictability and democratic accountability by binding interpreters to enacted language, though applications like Bostock have sparked debate over whether rigid ordinary-meaning searches overlook holistic statutory design or evolving societal understandings.[11][3]
Ongoing Debates and Evolutions Post-2020
Following the Supreme Court's 6–3 decision in Loper Bright Enterprises v. Raimondo on June 28, 2024, textualism emerged as the dominant framework for judicial review of agency interpretations, the Court having explicitly overruled the Chevron doctrine, which since 1984 had directed courts to defer to reasonable agency readings of ambiguous statutes.[82] Chief Justice John Roberts's majority opinion underscored that courts must employ "traditional tools of statutory construction," prioritizing the ordinary public meaning of statutory text over agency views, to prevent unelected officials from assuming lawmaking authority.[82] This shift, building on prior cases such as West Virginia v. EPA (2022), reinforced textualism's emphasis on the separation of powers, with Justices Clarence Thomas and Neil Gorsuch concurring to affirm that textualism confines judges to "lawfinding rather than lawmaking." Lower courts have since applied this rigorously, as seen in post-Loper rulings rejecting agency expansions of statutes such as the Clean Water Act without clear textual authorization.[56]
Debates have intensified over textualism's internal consistency and predictability, particularly as its application in a conservative-majority Court yields outcomes diverging from prior purposivist precedents. Scholars including William Eskridge, John Slocum, and Kevin Tobia contend in a 2023 Columbia Law Review analysis that "new textualism"—exemplified by selective use of canons, dictionary definitions, and surplusage avoidance—deviates from Scalia's purer form, risking "activist or idiosyncratic doctrines" that prioritize policy over text, as in the post-hoc debates over the statutory extensions recognized in Bostock v. Clayton County (2020).[83] Critics, often from purposivist academic circles noted for institutional left-leaning biases, argue this evolution fosters a rigidity that leads to "absurd" results, such as narrow readings in environmental regulations, while defenders like Justice Gorsuch maintain it enhances democratic accountability by adhering to enacted law.[11] A 2023 Georgetown Law study highlights "fault lines," including pluralistic reliance on present-day linguistic data over historical context, challenging textualism's claim to neutrality.
Evolutions include the integration of tools like corpus linguistics for empirical ordinary-meaning analysis, cited in dissents and amicus briefs since New York State Rifle & Pistol Ass'n v. Bruen (2022), and tensions with the major questions doctrine, which some textualists view as an extra-textual presumption against agency overreach absent explicit congressional delegation.[84] Post-Loper commentary anticipates "shadow Skidmore" deference—non-binding respect for agency expertise under the 1944 precedent—tempering pure textualism without reviving Chevron, as Loper Bright's nine citations to Skidmore suggest.[56] In criminal law, a 2025 push for "separation of powers" lenity aligns textualism with strict construction of ambiguous penal statutes, diverging from traditional vagueness-based triggers.[85] These developments, tracked in analyses of the 2025 Supreme Court term, indicate textualism's maturation amid calls to reconcile it with substantive canons, which a 2023 Harvard Law Review piece deems incompatible with strict text-fidelity.[30]