Turing Award
The A.M. Turing Award, often called the "Nobel Prize of Computing," is the highest distinction in computer science, presented annually by the Association for Computing Machinery (ACM) for contributions of lasting and major technical importance to the computing field.[1] Named in honor of the British mathematician Alan M. Turing, who laid the theoretical foundations of modern computing through his work on computability and the Turing machine, the award recognizes groundbreaking advances in areas such as algorithms, software systems, hardware architecture, and theoretical computer science.[1] Established in 1966, it has honored pioneers whose innovations have profoundly shaped technology and society, including creators of the internet's foundational protocols, database systems, and artificial intelligence techniques.[2] The award carries a prize of $1 million, funded by Google since 2014, and recipients are chosen through a nomination and review process conducted by a committee of leading computer scientists, including past laureates.[1] Eligibility is open to anyone worldwide who has made significant technical contributions, and nominations are encouraged from the global computing community.[3] Over nearly six decades the award has recognized 79 laureates (as of 2025), including Andrew Barto and Richard Sutton in 2024 for foundational work in reinforcement learning, Avi Wigderson in 2023 for insights into the role of randomness in computation, and Robert Metcalfe in 2022 for inventing Ethernet.[4] These selections underscore the award's role in advancing the discipline and inspiring future generations of researchers and engineers.[1]
Overview
Description
The A.M. Turing Award is an annual prize given by the Association for Computing Machinery (ACM) since 1966 for contributions of lasting and major technical importance to computer science.[1] It honors groundbreaking work that advances the field, encompassing both theoretical foundations and practical innovations in computing.[3] The award is administered by the ACM, a professional society founded in 1947 to advance computing as a science and profession, with over 100,000 members worldwide drawn from academia, industry, and research.[5] Often dubbed the "Nobel Prize of Computing" for its prestige, the Turing Award celebrates achievements that have profoundly shaped modern technology and society.[1] Eligibility is open to any individual—one or more recipients may be named each year—with no requirements regarding ACM membership, nationality, or institutional affiliation.[6] As of November 2025, 79 individuals have received the honor, and the prize has been conferred annually without interruption since its inception.[4]
Purpose and Criteria
The ACM A.M. Turing Award serves as the highest distinction in computer science, recognizing groundbreaking contributions that have fundamentally advanced the field's theoretical foundations, practical systems, and real-world applications, much as the Nobel Prizes honor transformative achievements in other scientific disciplines.[3][1] Established by the Association for Computing Machinery (ACM), the award underscores the profound impact of computing innovations on society and technology.[7] The core criterion is a contribution of "lasting and major technical importance to the computing field," evaluated primarily on innovation, depth of influence, and breadth of applicability across domains.[1][6] Qualifying work must demonstrate enduring significance, often through seminal advances in theoretical computing—including complexity theory and algorithms—practical systems such as databases and computer networks, or interdisciplinary fields such as cryptography and artificial intelligence.[1][4] For example, recipients have been honored for developing foundational concepts in computational complexity that underpin modern algorithm design, and for inventing key networking protocols that enabled the internet's growth.[8][9] Eligibility is limited to individuals or collaborative groups who are living when recipients are announced.[4] The evaluation prioritizes evidence of technical merit—peer-reviewed publications, verifiable implementations, and sustained scholarly or practical influence—over commercial or economic outcomes.[6][10] This focus reflects the award's commitment to honoring intellectual rigor and long-term advances that shape the evolution of computing.[11]
History
Inception
The ACM A.M. Turing Award was first proposed in August 1965 during a meeting of the ACM Council and formally established the following year as the organization's premier honor for technical contributions to computing.[12] Initially envisioned not as a lavish prize but as a distinguished lecture delivered at the annual ACM conference, it quickly evolved into a comprehensive recognition modeled after the Nobel Prize to elevate computer science's status amid its explosive postwar expansion. This growth, fueled by advances in electronic computing during and after World War II, positioned the field as an independent discipline deserving of equivalent acclaim.[12] Named in honor of Alan M. Turing (1912–1954), the British mathematician whose theoretical innovations laid the groundwork for modern computing, the award commemorates his seminal 1936 paper on computability and the Turing machine—a hypothetical device that formalized the limits of algorithmic processes—as well as his critical role in breaking German Enigma codes during World War II.[1] Turing's work, including explorations of undecidability such as the halting problem, provided enduring conceptual foundations that the award seeks to parallel through its recipients' achievements.[1] The inaugural Turing Award was presented in 1966 to Alan Perlis, a pioneering computer scientist at Carnegie Mellon University, for his profound influence on the development of programming languages such as ALGOL and his foundational efforts in computer science education, which helped shape the discipline's pedagogical standards.[1] It was accompanied by a gold medal symbolizing excellence and presented during the annual ACM conference; unlike later years, there was no separate formal ceremony.[13]
Evolution
In the 1970s and 1980s, the Turing Award expanded beyond solo recipients to recognize collaborative contributions, reflecting the growing interdisciplinary nature of computing research. For instance, Donald Knuth received the award alone in 1974 for his foundational work on algorithms and programming, while 1975 marked the first joint award, to Allen Newell and Herbert A. Simon for their pioneering efforts in artificial intelligence and cognitive simulation.[4][14][15] During the 1990s and 2000s, the award gained greater international prominence, with recipients from institutions worldwide, and saw significant growth in prestige and funding. In 2006, Frances Allen became the first woman to receive the Turing Award, honored for her pioneering contributions to compiler optimization and high-performance computing.[16][17] The prize was raised from $100,000 to $250,000 starting in 2007, supported jointly by Intel and Google, underscoring the award's rising stature.[18] From 2014 onward, the award's funding and focus evolved further, in step with rapid advances in artificial intelligence and large-scale systems. The prize was elevated to $1 million, fully funded by Google.[7] This period saw a stronger emphasis on AI and systems innovations, such as the 2018 award to Yoshua Bengio, Geoffrey Hinton, and Yann LeCun for conceptual and engineering breakthroughs in deep learning.[19] Over its history, the award has adapted to computing's maturation by shifting from predominantly solo honors in the early years to increasingly recognizing teams, as in the 2024 award to Andrew Barto and Richard S. Sutton for foundational work in reinforcement learning.[20] By 2025, a total of 79 individuals had received the award, illustrating its expanded scope.[21] The award has also incorporated more engineering contributions alongside theoretical advances, honoring practical innovations in areas such as systems design and applied algorithms.[1][22]
Award Administration
Nomination Process
The nomination process for the ACM A.M. Turing Award is open to any individual, whether an ACM member or not; self-nominations are prohibited, but endorsements from peers are strongly encouraged to highlight the nominee's contributions.[6] Submissions are handled exclusively through ACM's online nomination portal, with a typical deadline of December 15 each year, end of day Anywhere on Earth (UTC-12).[6] There is no submission fee, and the entire process is kept confidential to protect nominees and nominators.[1] Required materials include a detailed curriculum vitae of the nominee (covering publications, patents, and honors), a suggested citation not exceeding 25 words, a nomination statement of 500–1000 words arguing the nominee's lasting influence and alignment with the award criteria, and 4–8 letters of support from recognized experts in the relevant field or subfields.[6] The support letters must specifically address the nominee's technical contributions, their enduring influence on computer science, and their qualifications under the award's standards, with endorsers selected to represent diverse perspectives.[6] If needed, the award committee may solicit additional references to supplement the submission.[1] This structure emphasizes accessibility while prioritizing rigorous, peer-validated evidence of transformative work in computing.
Selection and Announcement
The selection of Turing Award recipients is managed by a dedicated committee of leading computer scientists, including former laureates, who review nominations to identify contributions of lasting and major technical importance to the field.[22] The committee, typically comprising eight members drawn from subfields such as theoretical computer science, systems, and artificial intelligence, evaluates submissions each spring, focusing on outstanding technical achievements with long-term influence on computing.[22] Deliberations emphasize rigorous assessment of originality, impact, and alignment with the award criteria, often involving consultation with additional specialists. While the exact voting mechanism—consensus or majority—is not publicly detailed, the committee's recommendations are subject to final approval by the ACM Council to maintain institutional oversight.[23] There is no predetermined number of recipients annually, allowing the committee to honor one or several individuals on merit. Announcements occur each year in early March via an official ACM press release, which details the laureates' achievements and includes a formal citation. For instance, the 2024 recipients, Andrew G. Barto and Richard S. Sutton, were announced on March 5 with the citation "for developing the conceptual and algorithmic foundations of reinforcement learning."[20] This timing aligns with the award's presentation at the ACM Awards Banquet in June, allowing months of preparation while building anticipation in the computing community. As of 2025, the selection procedures remain consistent with established practice.[1]
Prize and Ceremony
Financial Aspects
The ACM A.M. Turing Award carries a cash prize of US$1,000,000, funded entirely by Google since 2014.[1] When the award goes to multiple recipients, the prize is shared equally among them.[1] Over its history, the prize amount has increased in step with the award's rising prestige. In its early years, the prize was modest, at approximately US$1,000.[24] By the early 2000s it stood at US$100,000, with Intel Corporation providing funding starting in 2002.[25] In 2007, Google joined as co-sponsor and the amount rose to US$250,000, where it remained until 2013.[26] The 2014 increase to US$1,000,000, fully supported by Google, brought the Turing Award to the financial level of globally renowned scientific honors.[7] Beyond the monetary prize, recipients are reimbursed for travel expenses to attend the annual awards banquet where the honor is presented.[6] Google's ongoing sponsorship reflects the technology sector's commitment to recognizing foundational advances in computing and enables the Association for Computing Machinery (ACM) to maintain the award's independence.[7]
Presentation Event
The Turing Award is formally presented each year at the ACM's annual Awards Banquet, a gala event typically held in June in major cities such as San Francisco or New York.[1] The banquet is a key professional gathering for the computing community, bringing together researchers, industry leaders, educators, and dignitaries to celebrate outstanding contributions to the field.[27] Following disruptions from the COVID-19 pandemic, which led to virtual celebrations in 2020, the event has resumed in person, with hybrid options in subsequent years.[28] At the ceremony the ACM President presents the medal and citation to the laureates, recognizing their transformative work in computing.[1] Laureates deliver talks, typically 20–30 minutes long, sharing insights into their pioneering research and its impact.[27] The evening includes a formal dinner attended by recipients, their invited guests, sponsors, and ACM leadership, fostering networking among more than 1,000 participants from academia and industry.[27] ACM covers travel and lodging expenses for the winners and their companions to ensure broad participation.[6] The banquet also recognizes other ACM honors, such as the ACM Prize in Computing and Fellow inductions.[27] Media coverage has expanded since 2014, with video recordings of presentations and talks available for on-demand viewing, enhancing global accessibility.[29] The 2025 event, held on June 14 at San Francisco's Palace Hotel, honored the 2024 laureates Andrew G. Barto and Richard S. Sutton, whose foundational advances in reinforcement learning were highlighted in their presentations.[27]
Recipients
Complete List
The ACM A.M. Turing Award has been presented annually since its inception in 1966, recognizing contributions of lasting and major technical importance to computer science. As of 2025, 79 individuals have received the award, with multiple laureates honored in many years since 1975 (typically 1–3 per year).[4] Notable milestones include the inaugural award in 1966 to Alan Perlis and the selection of Frances Allen in 2006 as the first woman recipient.[4] The following table provides a complete chronological list of recipients, including their nationalities and a brief summary of the award citation.

| Year | Laureate(s) | Nationality | Citation Summary |
|---|---|---|---|
| 1966 | Alan J. Perlis | American | Contributions to programming language theory and compiler construction.[4] |
| 1967 | Maurice V. Wilkes | British | Contributions to computer architecture, including the design of the EDSAC.[4] |
| 1968 | Richard W. Hamming | American | Contributions to numerical analysis and error-correcting codes.[4] |
| 1969 | Marvin L. Minsky | American | Contributions to artificial intelligence.[4] |
| 1970 | James H. Wilkinson | British | Contributions to numerical analysis.[4] |
| 1971 | John McCarthy | American | Contributions to artificial intelligence.[4] |
| 1972 | Edsger W. Dijkstra | Dutch | Contributions to structured programming.[4] |
| 1973 | Charles W. Bachman | American | Contributions to database technology.[4] |
| 1974 | Donald E. Knuth | American | Contributions to the analysis of algorithms.[4] |
| 1975 | Allen Newell, Herbert A. Simon | American | Contributions to artificial intelligence.[4] |
| 1976 | Michael O. Rabin, Dana Scott | Israeli, American | Contributions to automata theory and non-deterministic machines.[4] |
| 1977 | John W. Backus | American | Contributions to compiler design and Fortran.[4] |
| 1978 | Robert W. Floyd | American | Contributions to algorithms and programming.[4] |
| 1979 | Kenneth E. Iverson | Canadian | Contributions to programming languages, including APL.[4] |
| 1980 | C. A. R. Hoare | British | Contributions to programming languages and semantics.[4] |
| 1981 | Edgar F. Codd | American | Contributions to database management systems.[4] |
| 1982 | Stephen A. Cook | Canadian | Contributions to complexity theory.[4] |
| 1983 | Ken Thompson, Dennis M. Ritchie | American | Development of generic operating systems and programming languages, including UNIX and C.[4] |
| 1984 | Niklaus E. Wirth | Swiss | Contributions to programming languages.[4] |
| 1985 | Richard M. Karp | American | Contributions to computational complexity.[4] |
| 1986 | John E. Hopcroft, Robert E. Tarjan | American | Contributions to the design and analysis of algorithms.[4] |
| 1987 | John Cocke | American | Contributions to compiler optimization.[4] |
| 1988 | Ivan E. Sutherland | American | Contributions to computer graphics.[4] |
| 1989 | William Kahan | Canadian | Contributions to numerical software.[4] |
| 1990 | Fernando J. Corbató | American | Contributions to time-sharing systems.[4] |
| 1991 | Robin Milner | British | Contributions to LCF theorem proving, the ML programming language, and the theory of concurrency (CCS).[4] |
| 1992 | Butler W. Lampson | American | Contributions to personal computing.[4] |
| 1993 | Juris Hartmanis, Richard E. Stearns | American | Contributions to computational complexity.[4] |
| 1994 | Edward A. Feigenbaum, Raj Reddy | American | Contributions to artificial intelligence.[4] |
| 1995 | Manuel Blum | American | Contributions to computational complexity theory.[4] |
| 1996 | Amir Pnueli | Israeli | Contributions to temporal logic.[4] |
| 1997 | Douglas C. Engelbart | American | Contributions to human-computer interaction.[4] |
| 1998 | James N. Gray | American | Contributions to database and transaction processing.[4] |
| 1999 | Frederick P. Brooks Jr. | American | Contributions to computer architecture and software engineering.[4] |
| 2000 | Andrew Chi-Chih Yao | Chinese-American | Contributions to computational theory.[4] |
| 2001 | Ole-Johan Dahl, Kristen Nygaard | Norwegian | Contributions to object-oriented programming.[4] |
| 2002 | Ronald L. Rivest, Adi Shamir, Leonard M. Adleman | American, Israeli, American | Contributions to public-key cryptography (RSA).[4] |
| 2003 | Alan C. Kay | American | Contributions to personal computing.[4] |
| 2004 | Vinton G. Cerf, Robert E. Kahn | American | Contributions to internet protocols (TCP/IP).[4] |
| 2005 | Peter Naur | Danish | Contributions to programming languages.[4] |
| 2006 | Frances E. Allen | American | Contributions to compiler optimization (first woman recipient).[4] |
| 2007 | Edmund M. Clarke, E. Allen Emerson, Joseph Sifakis | American, American, French | Contributions to model checking.[4] |
| 2008 | Barbara H. Liskov | American | Contributions to data abstraction and concurrency.[4] |
| 2009 | Charles P. Thacker | American | Contributions to personal computer design.[4] |
| 2010 | Leslie G. Valiant | British-American | Contributions to computational learning theory.[4] |
| 2011 | Judea Pearl | Israeli-American | Contributions to Bayesian networks and causal inference.[4] |
| 2012 | Shafi Goldwasser, Silvio Micali | Israeli-American, Italian-American | Contributions to the complexity-theoretic foundations of cryptography.[4] |
| 2013 | Leslie Lamport | American | Contributions to distributed systems.[4] |
| 2014 | Michael Stonebraker | American | Contributions to database systems.[4] |
| 2015 | Whitfield Diffie, Martin E. Hellman | American | Contributions to public-key cryptography.[4] |
| 2016 | Tim Berners-Lee | British | Contributions to the World Wide Web.[4] |
| 2017 | John L. Hennessy, David A. Patterson | American | Contributions to computer architecture.[4] |
| 2018 | Yoshua Bengio, Geoffrey E. Hinton, Yann LeCun | Canadian, British-Canadian, French-American | Contributions to deep learning.[4] |
| 2019 | Edwin E. Catmull, Patrick M. Hanrahan | American | Contributions to computer graphics.[4] |
| 2020 | Alfred V. Aho, Jeffrey D. Ullman | American, American | Contributions to compilers and theory.[4] |
| 2021 | Jack J. Dongarra | American | Contributions to numerical algorithms and software.[4] |
| 2022 | Robert Metcalfe | American | Contributions to Ethernet.[4] |
| 2023 | Avi Wigderson | Israeli | Contributions to computational complexity and randomness.[4] |
| 2024 | Andrew G. Barto, Richard S. Sutton | American, Canadian | Foundations of reinforcement learning.[4] |