Earned run average
Earned run average (ERA) is a key statistic in baseball that measures a pitcher's effectiveness by calculating the average number of earned runs they allow per nine innings pitched, where earned runs are those scored without the aid of defensive errors or passed balls.[1][2] This metric isolates a pitcher's responsibility for runs allowed, excluding those resulting from fielding mistakes, to provide a clearer assessment of their pitching skill.[1] The concept of ERA was developed in the mid-to-late 19th century by statistician and writer Henry Chadwick, who sought a more reliable performance indicator than a pitcher's win-loss record, which could be influenced by team offense and defense.[1] It gained official recognition in Major League Baseball in 1912, though retrospective calculations extend back to 1871.[3] Over time, ERA has become the most universally accepted tool for evaluating pitchers, with historical league averages varying widely (from a low of 2.19 in 1874 to highs above 4.80 in certain eras), reflecting changes in pitching styles, equipment, and rules.[1][4] To compute ERA, multiply the total earned runs by 9 and divide by the innings pitched, typically rounding to two decimal places; for example, if a pitcher allows 20 earned runs over 100 innings, their ERA is 1.80 (9 × 20 / 100 = 1.80).[1] Qualification for official ERA leadership requires a pitcher to have pitched at least one full inning per scheduled league game (typically 162 games in modern MLB, or about 162 innings).[5] While traditional ERA remains central, advanced variants like adjusted ERA (ERA+), which accounts for ballpark effects, and expected ERA (xERA), based on Statcast data, offer refined insights into pitcher performance.[6][7]
Fundamentals
Definition
In baseball, earned run average (ERA) is a fundamental statistic that measures the average number of earned runs a pitcher allows per nine innings pitched, providing a key indicator of their effectiveness in preventing opponents from scoring.[1] This metric focuses specifically on runs that result from the pitcher's own actions, such as allowing hits, walks, or home runs, rather than those aided by defensive miscues.[2] Earned runs are distinguished from unearned runs, which occur when a run scores due to errors committed by fielders or passed balls by the catcher, thereby not holding the pitcher directly accountable for the scoring.[2] The term "earned" emphasizes runs for which the pitcher bears primary responsibility, excluding those stemming from fielding mistakes that extend innings or advance runners.[8] Since the early 20th century, ERA has served as the standard benchmark for evaluating pitchers' performance across Major League Baseball, with elite pitchers typically posting seasons under 3.00, signifying exceptional run prevention.[1][9] For instance, top performers in recent seasons, like those leading Cy Young Award voting, often achieve ERAs below this threshold, highlighting their dominance.[9]
Calculation
The earned run average (ERA) is computed using the formula

\[ \text{ERA} = \left( \frac{\text{Earned Runs Allowed}}{\text{Innings Pitched}} \right) \times 9 \]

which standardizes the statistic to a nine-inning game, as established in Major League Baseball (MLB) scoring rules.[8][10] To calculate ERA, earned runs allowed by the pitcher are tracked over a season, career, or specified period, divided by the total innings pitched, and then multiplied by nine. Innings pitched include both complete and partial innings, with fractions recorded based on outs: one out equals one-third of an inning (0.333), two outs equal two-thirds (0.667), and three outs equal one full inning (1.000). For instance, a pitcher recording four full innings and two additional outs would log 4.667 innings pitched.[8][10] Earned runs are distinguished from unearned runs by the official scorer, who reconstructs the inning as if no errors, passed balls, or defensive interference had occurred, assuming ordinary effort by the defense and giving the pitcher the benefit of the doubt on base advancements. A run is earned if it scores through offensive actions such as hits, walks, sacrifice bunts, sacrifice flies, stolen bases, balks, or wild pitches, without the aid of an error or passed ball; otherwise, it is unearned. For example, if a batter reaches base on a fielding error and later scores on a subsequent hit, the official scorer replays the inning without the error; if the runner would not have reached base or advanced sufficiently to score, the run is unearned and excluded from ERA.[2][11] The calculation applies identically to starting and relief pitchers, though relief appearances are often shorter and involve inherited runners from prior pitchers, with responsibility for any runs those runners score assigned through the official scorer's reconstruction of the inning, as detailed in other sections.[8][11] As an illustrative example, consider a pitcher who allows 50 earned runs over 200 innings pitched: divide 50 by 200 to get 0.25, then multiply by 9 to yield an ERA of 2.25.[8]
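The only fiddly parts of the computation are the conversion between outs and innings pitched and the final rounding. The following Python sketch (the function names are illustrative, not from any official library) implements the formula and the worked examples above.

```python
def innings_from_outs(full_innings: int, extra_outs: int) -> float:
    """Convert innings plus leftover outs into decimal innings pitched;
    each out is worth one-third of an inning."""
    return full_innings + extra_outs / 3

def era(earned_runs: int, innings_pitched: float) -> float:
    """ERA = (earned runs / innings pitched) * 9, rounded to two places."""
    if innings_pitched <= 0:
        raise ValueError("ERA is undefined with zero innings pitched")
    return round(earned_runs / innings_pitched * 9, 2)

# Worked example from the text: 50 earned runs over 200 innings -> 2.25
print(era(50, 200))                       # 2.25
# Partial innings: four full innings plus two outs -> 4.667 IP
print(round(innings_from_outs(4, 2), 3))  # 4.667
print(era(2, innings_from_outs(4, 2)))    # 3.86
```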
History
Origins
The concept of earned run average (ERA) traces its roots to the late 19th century, when statistician Henry Chadwick, known as the "father of baseball," introduced the distinction between earned and unearned runs to better assess a pitcher's performance independent of defensive errors.[1] Chadwick's innovation aimed to isolate runs scored due to hits and walks (factors directly attributable to the pitcher) from those resulting from fielding miscues, addressing the shortcomings of prevailing metrics like total runs allowed per game, which did not differentiate pitcher responsibility.[12] This foundational idea emerged amid baseball's growing emphasis on statistical evaluation, as win-loss records often reflected team offense and defense more than individual pitching skill.[13] The official adoption of ERA as a tabulated statistic occurred in 1912, when National League secretary John Heydler implemented it as a formal measure, initially dubbed "Heydler's statistic."[14] The American League followed suit in 1913, marking the first season both major leagues recorded ERA officially, with calculations performed manually from box scores by scorers and league officials.[14] This standardization provided a more precise tool for evaluating pitchers, as it normalized earned runs allowed over nine innings, offering a defense-independent benchmark that win-loss records lacked.[1] Early motivations for ERA's introduction highlighted the limitations of win-loss tallies, which could undervalue elite pitchers on weak teams; for instance, Washington Senators ace Walter Johnson posted a 1.36 ERA in 1910 with a 25-17 record, underscoring his dominance despite the team's seventh-place finish and limited run support.[15] By focusing on earned runs (those scored without the aid of errors or passed balls), ERA established a clearer standard for pitcher effectiveness, supplanting cruder pre-ERA approaches like overall runs per game and enabling fairer comparisons across players and seasons.[1]
Evolution Across Eras
The dead-ball era, spanning roughly from 1904 to 1919, featured notably low earned run averages, with the league-wide average hovering around 2.82. This period was characterized by subdued offensive output due to the use of softer, less lively baseballs that were often discolored and scuffed during play, combined with strategic emphases on small-ball tactics like bunting and hit-and-run plays rather than power hitting. Pitchers benefited from legal tricks such as the spitball, contributing to sub-2.00 ERAs for elite performers and an overall environment where scoring was limited.[4] The transition to the live-ball era beginning in 1920 marked a dramatic shift, elevating league-average ERAs to approximately 4.05 through the 1920s. Key factors included the introduction of a more resilient, cork-centered ball wound with Australian yarn, the gradual outlawing of the spitball starting in 1920, and a revolutionary focus on power hitting exemplified by Babe Ruth's home run surge, which encouraged teams to prioritize slugging over contact-oriented strategies. These changes persisted into the 1930s, with averages around 4.18, as raised seams on the ball further aided grip and carry, sustaining higher run production despite the Great Depression's economic constraints.[4][16] Post-World War II developments from the late 1940s through the 1960s introduced further fluctuations, with league ERAs averaging about 3.98 in the 1950s amid franchise relocations and the proliferation of night games, which slightly favored hitters by improving visibility of the ball under artificial lights. Expansion in 1961 and 1962 diluted pitching talent across more teams, contributing to this stability at higher levels than the pre-war era, while the 1960s saw a dip to 3.39 through 1968 due to an expansive strike zone and dominant pitchers like Bob Gibson. In response to the "Year of the Pitcher," Major League Baseball lowered the mound from 15 to 10 inches and tightened the strike zone for 1969, boosting offense and raising ERAs to around 3.71 in the 1970s, as free agency, introduced after the 1975 Messersmith-McNally ruling, fostered pitcher specialization and deeper bullpens.[17][4][18] In the modern era from the 1990s to 2025, ERAs peaked at 4.70 in 1999 amid widespread performance-enhancing drug use, which amplified home run rates and overall scoring during the "steroids era," before declining to around 4.29 for the decade as testing ramped up post-2003. Analytics-driven approaches in the 2010s emphasized strikeouts and defensive shifts, stabilizing averages near 4.00, though the expansion of interleague play and the universal designated hitter rule in 2022 added minor offensive boosts. The 2023 pitch clock implementation shortened games by about 24 minutes on average but correlated with a rise in ERAs to 4.33 that year, attributed to rushed pitching routines increasing walks and hits, before settling to 4.07 in 2024 and 4.15 in 2025 as players adapted.[17][19][20]
Infinite and Undefined Cases
In baseball statistics, an infinite earned run average (ERA) arises when a pitcher surrenders one or more earned runs while recording zero innings pitched, resulting in division by zero in the standard ERA formula of (earned runs × 9) / innings pitched.[1] This scenario typically occurs in abbreviated appearances, such as when a pitcher enters the game but is immediately removed due to injury, ejection, or ineffective performance before retiring any batter, allowing runners to score on subsequent plays.[21] For instance, position players pitching in lopsided games often face this outcome, as they may walk or hit batters who then score without any outs recorded.[22] Undefined ERA cases are closely related and stem from the same mathematical issue of zero innings pitched, though the term is sometimes used interchangeably with infinite values in statistical contexts. To mitigate such divisions by zero, Major League Baseball (MLB) imposes minimum inning requirements for ERA qualification and listing on leaderboards; under current guidelines, a pitcher must accumulate at least one inning pitched per team game (typically 162 innings for a full season) to qualify for official ERA titles.[23] However, individual game or seasonal ERAs for pitchers with zero innings are still computed and noted as infinite in official records, without affecting qualified rankings.[5] Historically, before formalized qualification rules in the early 20th century, such cases were more prevalent and less standardized; for example, in 1914, several pitchers recorded appearances with earned runs and no outs, leading to undefined or infinite ERAs in pre-modern scoring.[21] MLB's official scoring manual, as updated through 2025, handles these by excluding non-qualifying pitchers from averages while preserving the infinite notation for completeness in player logs, particularly for rare instances involving non-pitchers or emergency relief.[23] These edge cases underscore the ERA metric's limitations in evaluating pitchers with extremely limited exposure, as the statistic assumes a denominator greater than zero and prioritizes sustained performance over one-off anomalies. They do not impact qualified leaderboards but serve as a reminder of the metric's reliance on innings pitched for meaningful interpretation.[1]
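As a sketch of how these conventions might be encoded in a stats system, the function below (illustrative, not MLB's official implementation) returns infinity when earned runs score over zero innings pitched, and treats the no-runs, no-innings case as NaN, mirroring the undefined notation described above.

```python
import math

def era_with_edge_cases(earned_runs: int, outs_recorded: int) -> float:
    """ERA computed over outs recorded (3 outs = 1 inning), handling
    the zero-innings edge cases discussed in the text."""
    if outs_recorded == 0:
        # Earned runs with no outs recorded: infinite ERA in the records.
        # No runs and no outs: nothing meaningful to report (undefined).
        return math.inf if earned_runs > 0 else math.nan
    return round(earned_runs / (outs_recorded / 3) * 9, 2)

print(era_with_edge_cases(2, 0))  # inf: removed before recording an out
print(era_with_edge_cases(0, 0))  # nan: no basis for an ERA at all
print(era_with_edge_cases(2, 3))  # 18.0: two earned runs in one full inning
```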
External Influences
Pitcher Roles
Starting pitchers bear the primary responsibility of setting the tone for games, often facing the opposing lineup three or more times in a single outing. This repeated exposure allows batters to gain familiarity with the pitcher's repertoire, while fatigue accumulates over longer stints, often totaling 150-180 innings per season for durable starters in the modern era. Consequently, starting pitchers' earned run averages are generally higher, with league averages about 0.50 runs above those of relievers, reflecting the demanding nature of their role.[10] In contrast, relief pitchers specialize in shorter appearances of 1 to 3 innings, frequently deployed in high-leverage situations such as late innings with runners on base. Their fresher arms enable maximum velocity and sharper command of specialized pitches, contributing to notably lower ERAs overall. Elite closers, who secure the final outs, often post sub-2.50 ERAs due to these optimized conditions and the pressure-packed but brief contexts of their work.[1] These role distinctions have evolved historically, particularly with the bullpen's expansion after the 1970s, when teams shifted toward more specialized relief usage and deeper pitching staffs, amplifying the ERA gap between starters and relievers. Metrics like Fielding Independent Pitching (FIP), which emphasize strikeouts, walks, hit-by-pitches, and home runs (events largely under a pitcher's control), tend to align more closely with reliever ERAs by mitigating the impact of defensive and situational factors. For instance, Hall of Fame starter Cy Young compiled a career ERA of 2.63 across over 7,000 innings, while closer Mariano Rivera achieved 2.21 in high-stakes relief roles, underscoring how pitcher responsibilities shape performance outcomes.[24][25][26][27]
Rule Variations
The designated hitter (DH) rule, introduced in the American League (AL) on January 11, 1973, allows a player to bat in place of the pitcher, thereby removing weak-hitting pitchers from the offensive lineup and boosting overall run production.[28] This change was implemented to counteract declining offense in the late 1960s and early 1970s, when league-wide scoring had dropped significantly.[29] By replacing pitchers, who typically hit below .200, with more capable hitters, the rule increased AL runs per game by approximately 10-15% compared to the National League (NL), directly elevating earned run averages (ERAs) as more earned runs crossed the plate.[30] For instance, the AL's average ERA rose from 3.06 in the strike-shortened 1972 season to 3.82 in 1973, an initial surge of about 25% attributable in part to the DH, though other factors like expanded strike zones contributed.[31] The DH created a persistent disparity between the leagues until the NL adopted it universally in 2022 as part of the collective bargaining agreement, narrowing the AL-NL gap in offensive output and ERA.[32] Before the universal DH, NL ERAs benefited from pitchers batting, which suppressed scoring; post-adoption, NL ERAs adjusted upward to align more closely with AL figures, with studies estimating an increase of approximately 0.20-0.30 runs per nine innings for NL pitchers due to facing stronger lineups without pitcher at-bats.[33] Following universal implementation, MLB-wide ERA climbed from 3.96 in 2022 to 4.33 in 2023 (a rise of roughly 9%), then decreased to 4.07 in 2024 before rising slightly to 4.15 in 2025, linked to enhanced offense across both leagues.[17] Other rule changes have also influenced ERA calculations and outcomes. In 1969, MLB lowered the pitching mound from 15 inches to 10 inches and shrank the strike zone to favor hitters after the pitcher-dominant 1968 "Year of the Pitcher," when the league ERA hit 2.98; this adjustment increased run scoring and elevated ERAs by an estimated 0.20-0.50 runs in subsequent seasons, restoring balance.[18] Interleague play, starting in 1997, further exposed pitchers to DH effects based on the home team's league: NL pitchers facing AL lineups saw their ERAs rise by about 0.15-0.25 runs on average in road games, as they contended with a designated hitter in the lineup rather than the opposing pitcher. Internationally, variations like the absence of the DH in early World Baseball Classics (WBCs) prior to 2023 led to lower tournament ERAs compared to MLB averages, as pitchers were required to bat, reducing offensive efficiency and run totals.[34] For example, WBC ERAs across the 2006-2017 editions averaged around 4.07-4.28, below MLB's typical 4.25-4.45, partly due to this rule suppressing scoring despite elite talent.[35] The 2023 WBC's adoption of the DH aligned its dynamics more closely with MLB, eliminating this distinction.
Ballpark and Environmental Effects
Ballpark dimensions and altitude significantly influence earned run averages (ERAs) by altering the distance balls travel and the likelihood of hits and home runs. High-altitude venues like Coors Field in Denver, situated over 5,000 feet above sea level, reduce air density, allowing fly balls to carry farther and increasing offensive output by approximately 15-20% compared to average parks.[36] This effect inflates pitchers' ERAs, as more earned runs result from the enhanced ball flight rather than individual performance. Conversely, pitcher-friendly parks such as Oracle Park in San Francisco suppress offense due to deep outfield dimensions and cooler marine air, reducing runs scored by about 8-10% and lowering ERAs for home pitchers.[37] To account for these discrepancies, metrics like ERA+ provide a park-adjusted measure of pitching effectiveness, normalizing a pitcher's ERA relative to league averages and venue-specific factors to better evaluate performance across different stadiums.[38] Environmental conditions further modulate ERAs beyond structural elements. High humidity decreases air density, enabling baseballs to travel farther with less resistance, which favors hitters and elevates ERAs by increasing the frequency of extra-base hits and home runs.[39] Wind direction and speed also play a role; tailwinds boost fly ball distance, while headwinds suppress it, leading to ERA fluctuations of up to 0.5 runs per game in extreme cases.[40] The opening of the Houston Astrodome in 1965 marked a pivotal shift by introducing the first fully enclosed, climate-controlled stadium, eliminating wind and variable weather effects that previously caused inconsistent scoring environments.[41] Recent data illustrates the magnitude of these influences, with 2024 venue-specific ERAs varying by up to 0.8-1.0 points across MLB parks due to combined ballpark and environmental factors. As of 2025, venue-specific ERAs continued to vary by up to 1.0-1.2 points, with Coors Field pitchers posting around 5.2 compared to the league average of 4.15. For instance, Dodger Stadium's marine layer, a cool, moist fog rolling in from the Pacific, often reduces home run distances by dampening ball carry, contributing to lower ERAs for Dodgers pitchers during night games.[42][43]
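ERA+ inverts the scale of raw ERA: 100 is league average and higher is better. The sketch below is a deliberately simplified version (Baseball-Reference's published ERA+ uses multi-year park factors and finer adjustments); the park factor and sample numbers are chosen purely for illustration.

```python
def era_plus(pitcher_era: float, league_era: float, park_factor: float = 1.0) -> int:
    """Simplified ERA+: scale the league benchmark by the home park's
    run environment, then compare the pitcher's ERA against it.
    A park_factor above 1.0 indicates a hitter-friendly venue."""
    return round(100 * (league_era * park_factor) / pitcher_era)

# A 4.50 ERA in a Coors-like park (assumed park factor ~1.15)
# against a 4.15 league average:
print(era_plus(4.50, 4.15, park_factor=1.15))  # 106: better than it looks
# The same 4.50 ERA in a neutral park:
print(era_plus(4.50, 4.15))                    # 92: below league average
```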
Advanced Analysis
Sabermetric Views
Sabermetricians have long identified several core limitations in earned run average (ERA) as a measure of pitcher performance. Although ERA seeks to isolate a pitcher's responsibility by excluding unearned runs, the distinction between earned and unearned runs rests on the official scorer's subjective judgments about errors and passed balls, which do not map cleanly onto a pitcher's individual responsibility.[44] Additionally, ERA is influenced by luck on balls in play, as captured by batting average on balls in play (BABIP), where outcomes depend heavily on factors like defensive shifts and random variance rather than pitcher skill alone.[45] Furthermore, ERA is sensitive to the sequencing of events, such as the clustering of hits or the timing of home runs relative to baserunners, which introduces noise unrelated to a pitcher's controllable inputs.[46] The sabermetric critique of ERA traces back to the 1980s, when pioneers like Bill James began challenging traditional metrics for their reliance on subjective and context-dependent elements, advocating instead for statistics that isolate pitcher talent from external variables.[47] James's work in his Baseball Abstracts highlighted how ERA masked true skill by incorporating fielding and luck, laying the groundwork for more refined approaches that gained traction through the Society for American Baseball Research (SABR). In modern sabermetrics, this critique has led to alternatives like Fielding Independent Pitching (FIP), which focuses solely on outcomes directly under a pitcher's control (walks, strikeouts, and home runs) while normalizing for league-average results on balls in play. FIP demonstrates a stronger correlation with future performance than ERA, serving as a more reliable predictor of a pitcher's talent level over time.[48][49] Building on FIP, contemporary tools like expected ERA (xERA) incorporate Statcast data, estimating runs from expected results based on exit velocity, launch angle, and sprint speed rather than actual outcomes, thus mitigating luck and sequencing biases.[50] By the 2020s, xERA has become a standard for evaluating sustainability, often revealing discrepancies that ERA overlooks. In 2025, integrations of AI models have further advanced ERA predictions by analyzing granular metrics like exit velocity and launch angle to forecast run prevention, enabling teams to identify unsustainable performances more precisely.[51] For instance, pitchers such as Cal Quantrill in recent seasons have posted low ERAs supported by favorable BABIP and sequencing but higher FIPs, signaling potential regression as their luck normalizes.[52]
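FIP's formula is public and simple enough to state directly: it also counts hit-by-pitches alongside walks, and a league-wide constant (roughly 3.10, recomputed each season so that league-average FIP matches league-average ERA) scales it onto the familiar ERA range. A minimal sketch, with the constant treated as an assumption:

```python
def fip(hr: int, bb: int, hbp: int, k: int, ip: float, constant: float = 3.10) -> float:
    """Fielding Independent Pitching: only home runs, walks, hit-by-pitches,
    and strikeouts enter the calculation; balls in play are excluded."""
    return round((13 * hr + 3 * (bb + hbp) - 2 * k) / ip + constant, 2)

# A pitcher with 20 HR, 50 BB, 5 HBP, and 200 K over 180 innings:
print(fip(20, 50, 5, 200, 180))  # 3.24
```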
Comparisons to Other Metrics
Earned run average (ERA) provides a comprehensive measure of a pitcher's run prevention but differs from walks plus hits per inning pitched (WHIP) by incorporating the outcomes of the baserunners a pitcher allows, such as home runs and runs scored through sequencing and defense.[53] WHIP isolates the pitcher's ability to prevent baserunners via hits and walks, making it less influenced by factors like ballpark effects or defensive play, whereas ERA reflects a broader scope including how those runners convert into earned runs.[53] For instance, a pitcher with a relatively high WHIP may still post a low ERA if the contact allowed results in weak hits or if the defense excels at stranding runners, highlighting WHIP's focus on contact management over total run allowance.[54] In contrast to wins and losses, which are heavily team-dependent metrics influenced by offensive support, bullpen performance, and game situations, ERA serves as a more direct indicator of a pitcher's individual skill in preventing runs.[55] Pitcher wins exhibit a low correlation with personal performance due to external factors like run support, while ERA demonstrates a stronger correlation with overall run-prevention effectiveness.[55] This makes ERA a preferable skill-based evaluation tool, as wins can misrepresent a pitcher's contributions in games where team context overrides individual control.[55] OPS allowed functions as an offensive counterpart to ERA, quantifying the on-base plus slugging performance of batters facing a pitcher, but it overlooks the sequencing of events that ERA captures through actual runs scored.[38] While OPS allowed effectively measures the quality of contact permitted, ignoring baserunner advancement and scoring opportunities, ERA accounts for the full run-scoring process, providing a more holistic view of pitching impact despite its susceptibility to luck in run distribution.[38] In modern analysis as of 2025, ERA is frequently paired with skill-interactive ERA (SIERA) to address its limitations in predicting future performance by emphasizing pitcher-controlled elements like strikeouts, walks, and home runs over batted-ball outcomes.[56] SIERA offers deeper insights into sustainable run prevention, complementing ERA's descriptive role; for example, Jacob deGrom's elite career ERA of 2.57 aligns closely with his low opponents' OPS of approximately .580, underscoring how such metrics reinforce evaluations of dominant pitchers.[57][58]
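The WHIP contrast in particular is easy to make concrete: WHIP counts baserunners allowed per inning and says nothing about whether they score. A small sketch, under the same illustrative conventions as the earlier examples:

```python
def whip(hits: int, walks: int, innings_pitched: float) -> float:
    """WHIP = (hits + walks) / innings pitched; unlike ERA, it ignores
    whether those baserunners ever come around to score."""
    return round((hits + walks) / innings_pitched, 2)

# Two pitchers can share a 1.25 WHIP yet post very different ERAs,
# depending on sequencing, home runs, and defense:
print(whip(180, 45, 180))  # 1.25
```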
Records
All-Time MLB Leaders
The all-time leaders in career earned run average (ERA) in Major League Baseball are determined using a minimum qualifier of 1,000 innings pitched, emphasizing pitchers who maintained exceptional control and effectiveness over substantial workloads. These rankings highlight the dominance of early baseball eras, particularly the dead-ball period before 1920, when lower-scoring games resulted from factors like the condition of the baseball, higher error rates in fielding, and different strategic approaches to pitching. As a result, the top positions are occupied almost exclusively by pitchers active in the late 19th and early 20th centuries, with ERAs far below modern benchmarks, which typically hover around 3.00 to 4.00 due to evolved rules, equipment, and offensive emphases.[59] The following table presents the top 20 all-time MLB career ERA leaders, including their ERA, years active, innings pitched, and primary teams:
| Rank | Player | ERA | Years | IP | Primary Teams |
|---|---|---|---|---|---|
| 1 | Ed Walsh | 1.82 | 1904-1917 | 2964.1 | Chicago White Sox |
| 2 | Addie Joss | 1.89 | 1902-1910 | 2327.0 | Cleveland Naps |
| 3 | Jim Devlin | 1.90 | 1875-1879 | 1405.0 | Louisville Grays |
| 4 | Jack Pfiester | 2.02 | 1903-1911 | 1067.1 | Chicago Cubs |
| 5 | Smoky Joe Wood | 2.03 | 1908-1922 | 1434.1 | Boston Red Sox, Cleveland Indians |
| 6 | Mordecai Brown | 2.06 | 1903-1916 | 3172.1 | Chicago Cubs |
| 7 | John Ward | 2.10 | 1878-1894 | 2469.2 | New York Giants, others |
| 8 | Christy Mathewson | 2.13 | 1900-1916 | 4788.2 | New York Giants |
| 8 | Al Spalding | 2.13 | 1871-1878 | 2886.1 | Boston Red Stockings |
| 10 | Tommy Bond | 2.14 | 1874-1884 | 3628.2 | Boston Red Caps, others |
| 11 | Rube Waddell | 2.16 | 1897-1910 | 2961.1 | Philadelphia Athletics |
| 12 | Walter Johnson | 2.17 | 1907-1927 | 5914.1 | Washington Senators |
| 13 | Mariano Rivera | 2.21 | 1995-2013 | 1283.2 | New York Yankees |
| 14 | Jake Weimer | 2.23 | 1903-1909 | 1472.2 | Chicago Cubs |
| 15 | Orval Overall | 2.26 | 1905-1913 | 1535.1 | Chicago Cubs |
| 16 | Will White | 2.28 | 1877-1886 | 3542.2 | Cincinnati Reds, others |
| 17 | Babe Ruth | 2.28 | 1914-1935 | 1221.1 | Boston Red Sox, New York Yankees |
| 18 | Ed Reulbach | 2.28 | 1905-1917 | 2632.1 | Chicago Cubs |
| 19 | Jim Scott | 2.30 | 1909-1917 | 1892.0 | Chicago White Sox |
| 20 | Reb Russell | 2.33 | 1909-1913 | 1291.2 | Chicago White Sox |
Live-Ball Era Leaders
The live-ball era, which began in 1920 with alterations to the baseball itself and a shift toward more offensive play, introduced higher run environments that tested pitchers' ability to limit earned runs over extended careers. Career ERA leaders in this era are evaluated for starting pitchers only, requiring a minimum of 1,000 innings pitched since 1920, with ERA calculated exclusively from post-1920 performance to ensure consistency across eras. This criterion highlights endurance and effectiveness in an offensive landscape where league-average ERAs often exceeded 4.00, making sub-3.00 marks particularly elite.[17] Prominent examples include Grover Cleveland "Pete" Alexander, whose post-1920 performance from 1920 to 1930 yielded a 3.09 ERA over 2,437 innings, reflecting his adaptation to the new era despite a pre-1920 dead-ball dominance. Similarly, Lefty Grove posted a 3.06 ERA across his entire 1925–1941 career with the Philadelphia Athletics and Boston Red Sox, leading the American League in ERA nine times. These figures underscore the rarity of sustained excellence amid increased scoring, with only a handful of qualifiers achieving ERAs under 3.10 when accounting for longevity and volume.[62] The following table lists the top 10 starting pitchers by career ERA in the live-ball era (updated through the 2025 season), emphasizing those with predominant starting roles (high games started relative to appearances). Data prioritizes verified game logs for accuracy.
| Rank | Pitcher | ERA | Years Active | IP (Post-1920) |
|---|---|---|---|---|
| 1 | Clayton Kershaw | 2.53 | 2008–2025 | 2,855.1 |
| 2 | Jacob deGrom | 2.57 | 2014–2025 | 1,539.2 |
| 3 | Whitey Ford | 2.75 | 1950–1967 | 3,170.1 |
| 4 | Sandy Koufax | 2.76 | 1955–1966 | 2,324.1 |
| 5 | Spud Chandler | 2.84 | 1937–1947 | 1,442.1 |
| 6 | Tom Seaver | 2.86 | 1967–1986 | 4,783.0 |
| 7 | Jim Palmer | 2.86 | 1965–1984 | 3,948.0 |
| 8 | Juan Marichal | 2.89 | 1960–1975 | 3,507.0 |
| 9 | Bob Gibson | 2.91 | 1959–1975 | 3,884.1 |
| 10 | Pedro Martínez | 2.93 | 1992–2009 | 2,827.1 |