
Purely functional programming

Purely functional programming is a programming paradigm that treats all computation as the evaluation of mathematical functions, emphasizing the construction of programs through the composition and application of pure functions without mutable state or side effects. In this approach, functions are deterministic, producing identical outputs for identical inputs regardless of execution context, and they avoid interactions with external state such as global variables or I/O operations. This strict adherence to purity distinguishes it from impure functional programming, where side effects may be permitted under controlled conditions. Key principles of purely functional programming include referential transparency, which allows any expression to be replaced by its evaluated value without changing the program's meaning, enabling easier reasoning and optimization. Immutability is fundamental, as values once created cannot be altered, promoting safer concurrency and reducing errors from unintended modifications. Iteration is achieved through recursion rather than mutable loops, and functions are typically higher-order, capable of accepting other functions as arguments or returning them, which facilitates abstraction and code reuse. Many implementations incorporate lazy evaluation, deferring computation until a value is required, which supports infinite data structures and enhances modularity by decoupling expression evaluation from order of execution. Exemplary languages include Haskell, a standardized, lazy, purely functional language developed in the 1990s that enforces purity through its type system and monadic structures for handling effects; Miranda, an earlier lazy functional language influential in academic settings; and Clean, which pioneered uniqueness typing for efficient I/O in a pure context. These languages demonstrate the paradigm's roots in the lambda calculus and its evolution from earlier systems like Lisp and ML, which are functional but impure.
The paradigm yields significant advantages, including enhanced modularity through higher-order functions and lazy evaluation, which often results in more concise and reusable code compared to imperative alternatives. It simplifies program verification, testing, and debugging by eliminating side-effect-related bugs and enabling formal proofs akin to mathematical derivations. Additionally, the absence of shared mutable state facilitates parallelism and concurrency, making it suitable for modern multicore and distributed systems, while reducing overall development time on complex applications.

Fundamentals

Definition of Purity

In purely functional programming, a pure function is defined as one that, for any given input, always produces the same output and has no observable side effects, such as I/O operations, modification of global state, or interactions with external environments. This ensures that the function's behavior is deterministic and solely dependent on its arguments, mirroring the properties of mathematical functions. At its core, purely functional programming treats computations as the evaluation of mathematical functions, where programs are constructed as compositions of expressions rather than sequences of imperative commands that alter program state. This approach emphasizes immutability and the absence of hidden dependencies, allowing programs to be reasoned about equationally, much like algebraic manipulations. The foundational principles of purity trace their origins to the lambda calculus, developed by Alonzo Church in the 1930s as a formal system for expressing computation through function abstraction and application without any mutable state. Church's work provided a theoretical basis for modeling all computable functions in a purely applicative manner, influencing the design of modern purely functional languages. For instance, a simple addition function like add(2, 3) exemplifies a pure function: given inputs 2 and 3, it invariably returns 5, with no alteration to external variables or generation of side effects. In contrast, a random number generator is impure, as it may yield different results for the same call due to reliance on internal or external state, violating purity. Purity in functions also enables referential transparency, where an expression can be substituted with its value without altering the program's meaning.
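The contrast can be sketched in Scala (the language of this article's other examples); `add` and `Counter` are illustrative names, not part of any library:

```scala
// Pure: the result depends only on the arguments, with no observable effects.
def add(a: Int, b: Int): Int = a + b

// Impure: the result depends on hidden mutable state, so repeated calls
// with identical (here, absent) inputs yield different values.
object Counter {
  private var state: Int = 0
  def next(): Int = { state += 1; state }
}

val first  = add(2, 3)        // always 5
val second = add(2, 3)        // still 5: deterministic
val c1 = Counter.next()       // 1
val c2 = Counter.next()       // 2: same call, different result
```

`add` behaves like a mathematical function; `Counter.next` is the analogue of the impure random-number generator described above.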

Referential Transparency

Referential transparency is a fundamental property in purely functional programming, where an expression can be replaced by its evaluated value at any point in the program without altering the overall behavior or meaning. This substitutability ensures that the value of an expression depends solely on its subexpressions and not on external factors such as mutable state or execution context. As defined in early work on programming language semantics, referential transparency means that to evaluate an expression containing a subexpression, only the value of the subexpression is needed, preserving mathematical-like predictability. Purity in functional programming, characterized by the absence of side effects and deterministic evaluation given the same inputs, directly implies referential transparency, since expressions remain consistent under substitution. Formally, in the lambda calculus, this is illustrated by the equivalence (\lambda x.\, x + 1)\ 5 \equiv 6, where the application can be substituted by its result without affecting the program's semantics, demonstrating how pure functions enable such equational substitutions. The benefits of referential transparency are profound for program analysis and optimization. It facilitates techniques like common subexpression elimination and memoization, where repeated occurrences of the same expression are computed only once, improving efficiency without changing results. Moreover, it supports equational reasoning, treating programs as algebraic expressions that can be manipulated and proven correct using mathematical laws, as seen in the algebra of functional programs, where combining forms allow transformations like [f, g] \circ h = [f \circ h, g \circ h]. This equational approach contrasts with imperative styles by enabling program transformation and optimization at a high level. In practical terms, referential transparency simplifies debugging and testing by ensuring functions produce predictable outputs solely from their inputs, independent of evaluation order or global state, thereby reducing errors from hidden dependencies and allowing modular composition with confidence.
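The substitution property can be demonstrated with a small Scala sketch (`square` is an illustrative name):

```scala
def square(x: Int): Int = x * x

// Because square is pure, square(5) may be replaced by its value 25
// anywhere in the program without changing the program's meaning.
val viaCall  = square(5) + square(5)
val viaValue = 25 + 25   // the substituted form

// Both expressions denote the same value (50), so an optimizer may
// freely rewrite one into the other, or compute square(5) only once.
```

If `square` printed to the console or read mutable state, the two forms would no longer be interchangeable, which is exactly what referential transparency rules out.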

Differences from Impure Functional Programming

Side Effects and Impurity

In purely functional programming, functions produce no observable changes beyond their return values, but side effects introduce modifications to the external environment that violate this principle. Common types of side effects include input/output (I/O) operations, such as reading from or writing to files or consoles; modifications to global variables, which alter state shared across function invocations; throwing exceptions, which abruptly terminates normal execution flow; and non-local control transfers, like continuations or gotos, that jump execution to arbitrary points in the program. These effects make function behavior dependent on context, execution order, or external state, contrasting sharply with the deterministic nature of pure functions. Impure functional programming languages, such as Lisp and Scala, permit side effects alongside functional constructs like higher-order functions and immutable data, fostering hybrid imperative-functional styles that blend purity with practicality. In Lisp, for instance, primitives like rplaca and rplacd enable mutation of list structures, allowing destructive updates within otherwise applicative code. Similarly, Scala integrates functional features with object-oriented elements, supporting mutable variables and I/O directly in functional expressions, which enables efficient real-world applications but compromises strict purity. This allows developers to leverage functional abstractions for expressiveness while incorporating imperative idioms for performance-critical tasks. The introduction of side effects leads to several consequences that undermine the benefits of purity, including loss of predictability due to non-deterministic outcomes from timing or order dependencies, increased difficulty in debugging because effects can propagate unpredictably across modules, and reduced modularity as functions become harder to compose without considering their environmental impact. For example, consider an impure function that computes a sum and prints it to the console:
```scala
def impureSum(a: Int, b: Int): Int = {
  val result = a + b
  println(s"Sum is: $result")  // Side effect: I/O
  result
}
```
This function's observable behavior changes based on when it is called, potentially interleaving output with other prints, whereas a pure equivalent simply returns the value without external interaction:
```scala
def pureSum(a: Int, b: Int): Int = a + b
```
The pure version maintains referential transparency (replacing calls with their results does not alter program behavior), while the impure one breaks it, complicating reasoning and testing. Side effects thus introduce bugs related to execution order and state interference, which pure functions avoid entirely. To mitigate impurity in such languages, monads serve as a conceptual framework to encapsulate side effects, structuring computations so that effects like I/O or state changes are isolated within a composable wrapper, preserving much of the pure style's benefits without eliminating effects outright. This approach, originally developed for pure languages but adaptable to impure ones, sequences effects predictably while allowing controlled interaction with the environment.
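As an illustration, the monadic approach can be sketched in Scala with a minimal IO-like wrapper; `IO`, `putLine`, and `sumAndReport` are hypothetical names for a simplified model, not a production effect library:

```scala
// An IO[A] is a *description* of an effect yielding an A;
// nothing happens until unsafeRun() is invoked at the program's edge.
final case class IO[A](unsafeRun: () => A) {
  def map[B](f: A => B): IO[B]         = IO(() => f(unsafeRun()))
  def flatMap[B](f: A => IO[B]): IO[B] = IO(() => f(unsafeRun()).unsafeRun())
}
object IO {
  def pure[A](a: A): IO[A] = IO(() => a)
}

def putLine(s: String): IO[Unit] = IO(() => println(s))

// Composing the description is pure; the print happens only when run.
def sumAndReport(a: Int, b: Int): IO[Int] =
  for {
    result <- IO.pure(a + b)
    _      <- putLine(s"Sum is: $result")
  } yield result
```

Calling `sumAndReport(2, 3)` builds a value describing the computation; `sumAndReport(2, 3).unsafeRun()` actually prints and returns 5, keeping the effect sequenced and isolated.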

Mutable State Management

In impure functional programming, mutable state is managed through mechanisms like reference cells and arrays that allow in-place modifications, enabling imperative-style updates within a largely functional framework. For instance, in languages such as Standard ML (SML), reference cells of type 'a ref are created using the ref constructor, dereferenced with the ! operator, and updated via the := assignment operator, permitting variables to change over time, unlike immutable values in pure paradigms. Arrays, provided via the Array structure, similarly support mutable operations like Array.update, which alters elements at specific indices without creating new structures. These features extend functional languages with imperative capabilities, as seen in SML's design, where such primitives integrate seamlessly with higher-order functions but introduce observable changes to program state. A key challenge of mutable state in impure functional programming is the introduction of aliasing, where multiple identifiers can refer to the same underlying storage location, leading to unexpected interactions during updates. For example, consider a counter implemented with a shared reference cell: val counter = ref 0; fun increment () = (counter := !counter + 1; !counter);. Calling increment() twice yields 1 and then 2, but if another alias val alias = counter is used to set alias := 0 midway, subsequent increments start from the reset value, complicating reasoning about the program's behavior. This breaks referential transparency, as the meaning of an expression can depend on external mutations rather than solely on its subexpressions, making equational reasoning difficult and increasing the risk of subtle bugs. In concurrent settings, mutable state exacerbates issues like race conditions, where unsynchronized access to shared references can produce nondeterministic outcomes, though Standard ML itself lacks built-in concurrency primitives to mitigate this.
To address these challenges while retaining some functional benefits, impure functional languages employ workarounds such as state-passing patterns, where state is explicitly threaded through function arguments to avoid hidden mutation, or closures that encapsulate mutable references for localized control. In state-passing, a function might take the current state as input and return an updated state alongside the result, simulating purity by making dependencies explicit, though this incurs overhead from repeated copying or allocation compared to direct mutation. Closures can capture references in their environment, allowing incremental updates within a scoped context, but they still risk aliasing if the captured reference is shared externally. These techniques aim to localize mutation but often require careful discipline to prevent the proliferation of side effects, which encompass broader external interactions beyond internal state changes. The trade-offs of mutable state in impure functional programming favor efficiency for algorithms benefiting from in-place updates, such as certain graph traversals or accumulators, where avoiding data duplication reduces time and memory overhead. However, this comes at the cost of reliability, as aliasing and loss of referential transparency hinder debugging, testing, and parallelization, potentially leading to programs that are harder to verify than their immutable counterparts.
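A state-passing version of the counter above can be sketched in Scala; `increment` is an illustrative name showing the pattern, not a library API:

```scala
// The counter's state is threaded explicitly: each call takes the
// current state and returns (new state, result). The function itself
// is pure, and its dependence on state is visible in its type.
def increment(state: Int): (Int, Int) = {
  val next = state + 1
  (next, next)
}

val s0 = 0
val (s1, r1) = increment(s0)   // r1 == 1
val (s2, r2) = increment(s1)   // r2 == 2
// s0 and s1 are unchanged; every intermediate state remains a valid value,
// so there is no alias through which another part of the program can reset it.
```

Compared with the SML reference-cell version, the mutation is gone at the cost of passing the state by hand, which is exactly the bookkeeping that the State monad automates in pure languages.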

Core Properties

Evaluation Strategies

In strictly evaluated purely functional languages, function arguments are evaluated to values prior to function application, a strategy known as call-by-value. This approach simplifies reasoning about cost and evaluation order, and it avoids the redundant computation of arguments used multiple times, unlike call-by-name, where arguments may be re-evaluated at each use. The trade-off is that a non-terminating argument causes divergence even when the function never uses its value. In contrast, non-strict or lazy evaluation defers the computation of arguments until their values are actually demanded by the function body, typically implemented via call-by-need to share results and avoid recomputation. This strategy, central to Haskell, enables the construction and manipulation of infinite data structures, such as lazy lists, without immediate resource exhaustion, as only the required portions are computed on demand. Additionally, it supports natural short-circuiting in control structures like conditionals, where unevaluated branches are discarded if not selected. To illustrate the difference, consider a conditional expression if cond then expr1 else expr2. If if were an ordinary function in a strict language, both expr1 and expr2 would be computed regardless of cond's value, wasting effort or even diverging if, for example, expr2 is an expensive or non-terminating computation; this is why strict languages treat conditionals as special syntactic forms. In a lazy language, only the selected branch is evaluated, so if can be defined as an ordinary function, mirroring logical short-circuiting and aligning with referential transparency to facilitate optimizations without altering program meaning. Comparatively, strict evaluation may perform unnecessary work on arguments that go unused in higher-order functions, while lazy evaluation excels in compositional settings but can incur higher memory overhead from thunk storage and sharing mechanisms. Referential transparency in purely functional programs further aids lazy strategies by allowing safe reordering and sharing of computations during optimization.
Lazy evaluation was popularized in the 1990s through Haskell, where it became a defining feature to support modular programming with infinite structures and higher-order combinators. Its theoretical foundations trace to the call-by-name lambda calculus, as explored in natural semantics models that make heap sharing explicit to capture lazy evaluation precisely.
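Scala's standard `LazyList` gives a concrete sketch of call-by-need within an otherwise strict language:

```scala
// LazyList defers element computation until demanded, permitting an
// infinite structure: LazyList.from(0) is the sequence 0, 1, 2, ...
val naturals: LazyList[Int] = LazyList.from(0)

// Only the first five elements are ever computed.
val firstFive = naturals.take(5).toList   // List(0, 1, 2, 3, 4)

// Call-by-need sharing: a lazy val's initializer runs at most once,
// and the memoized result is reused on later accesses.
lazy val answer: Int = { println("computing"); 42 }
val a1 = answer   // forces the computation (prints once)
val a2 = answer   // reuses the cached 42
```

This mirrors the lazy-list behavior described above: demand drives evaluation, and sharing prevents recomputation.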

Parallelism and Concurrency

In purely functional programming, the absence of shared mutable state significantly simplifies parallelism and concurrency by eliminating the race conditions and data races that plague imperative programs. Since all data is immutable and functions are referentially transparent, computations on disjoint parts of the data can proceed independently without synchronization overhead, enabling straightforward parallel execution of independent subcomputations. This purity allows developers to focus on algorithmic structure rather than low-level coordination mechanisms. Key mechanisms for achieving parallelism leverage the functional paradigm's recursive and compositional nature. Divide-and-conquer patterns, common in functional recursion, naturally decompose problems into independent subproblems that can be evaluated in parallel; for instance, mergesort can split lists and process halves concurrently, yielding logarithmic depth on parallel architectures. Additionally, lazy evaluation integrates well with asynchronous constructs like futures, where unevaluated expressions (thunks) serve as deferred computations that can be sparked in parallel contexts. A representative example is a parallel map operation over an immutable list, where each element's transformation is executed concurrently using futures, ensuring deterministic results due to the lack of side effects. Despite these advantages, challenges arise from immutability, particularly the potential overhead of creating new copies for each update, which can increase memory usage and slow performance in data-intensive tasks. Solutions include evaluation strategies that optimize parallel execution without altering purity, such as Haskell's Strategies library, which modularizes parallelism by specifying how to traverse and evaluate structures in parallel while preserving determinism. This approach separates algorithmic logic from execution strategy, mitigating overhead through selective sparking and controlled depth of evaluation.
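The parallel map described above can be sketched with Scala's standard `Future`s; `parMap` is an illustrative helper, not a library function:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

// Each element is transformed in its own Future. Because f is pure and
// the input list is immutable, the result is deterministic regardless
// of how the runtime schedules the individual tasks.
def parMap[A, B](xs: List[A])(f: A => B): List[B] =
  Await.result(Future.sequence(xs.map(x => Future(f(x)))), 30.seconds)

val squares = parMap(List(1, 2, 3, 4))(x => x * x)   // List(1, 4, 9, 16)
```

`Future.sequence` preserves the input order, so the output matches a sequential `map` exactly; purity is what makes this reordering of work safe.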
Research from the 2000s onward highlighted the real-world benefits of these properties, demonstrating that purely functional programs scale more easily on multicore systems than imperative counterparts, with parallel extensions achieving near-linear speedups on commodity multicore hardware by exploiting inherent parallelism. For example, early implementations of parallel Haskell showed effective utilization of multiple cores for divide-and-conquer algorithms, paving the way for broader adoption in scalable computing.

Persistent Data Structures

In purely functional programming, data structures adhere to the principle of immutability, meaning that once created, they cannot be modified in place; instead, any update operation produces a new structure while leaving the original intact. This approach ensures referential transparency and avoids side effects, allowing values to be safely shared across computations. Persistence in these data structures refers to the ability to maintain multiple versions of the data over time, where unchanged portions are shared between versions to achieve efficiency. A key technique is path copying, used in tree-based structures, where only the path from the root to the modified node is copied during an update, enabling the rest of the structure to be reused without duplication. For example, in linked lists constructed from cons cells (pairs consisting of a value and a pointer to the next cell), prepending an element creates a new cell that shares the entirety of the original list, resulting in O(1) time for the update thanks to sharing. Common persistent data structures include balanced binary search trees, such as AVL trees adapted for persistent sets, which maintain balance to support efficient operations, and tries (prefix trees) for storing and retrieving strings or sequences. These structures typically achieve O(log n) time for insertions, deletions, and lookups, where n is the number of elements, by leveraging the logarithmic height of balanced trees and the prefix-sharing properties of tries. The benefits of persistent data structures include inherent thread safety, as immutability eliminates race conditions without requiring locks, and support for versioning, which allows efficient checkpointing and backtracking in computations. Historically, persistent data structures emerged in the 1970s and 1980s within algorithms research, with early contributions like Dobkin and Lipton's 1976 work on slab partitioning for planar point location and Sarnak and Tarjan's 1986 development of persistent binary search trees supporting O(log n) operations on any version.
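Cons-cell sharing can be sketched in Scala; `PList`, `PCons`, and `PNil` are illustrative names mirroring the description above:

```scala
// A cons cell: a value plus a reference to the rest of the list.
sealed trait PList[+A]
case object PNil extends PList[Nothing]
final case class PCons[A](head: A, tail: PList[A]) extends PList[A]

val xs: PList[Int] = PCons(2, PCons(3, PNil))
val ys: PList[Int] = PCons(1, xs)   // O(1) prepend: ys reuses all of xs

// Structural sharing: ys's tail is the very same object as xs, not a copy.
val shared = ys match {
  case PCons(_, tail) => tail eq xs
  case PNil           => false
}
```

Both `xs` and `ys` remain valid versions of the list: the "update" created one new cell, and the old version is untouched, which is exactly the persistence property.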
Okasaki's advancements in the 1990s, including techniques for real-time queues and catenable lists that reconcile amortization with strict persistence via lazy evaluation, built on this foundation to enable more practical implementations.

Languages and Implementations

Notable Purely Functional Languages

Haskell, a standardized purely functional programming language, was developed in the late 1980s by a committee of researchers seeking a common platform for non-strict functional languages, with the first language report published in 1990. It features lazy evaluation by default and enforces purity through its type system and monadic I/O, making it suitable for applications requiring strong correctness guarantees. Haskell has found extensive use in academia for teaching and for research in areas like type systems and program verification, as well as in industry for web development and financial systems where reliability and parallelism are critical. Miranda, developed by David Turner in the early 1980s, served as a key precursor to Haskell, introducing lazy evaluation and a clean syntax for purely functional programming with non-strict semantics. Released commercially in 1985, it influenced the design of subsequent languages by demonstrating practical implementation of higher-order functions and polymorphism without side effects. Miranda found early adoption in academic settings for algorithm development and as a stepping stone toward more comprehensive standards like Haskell. Clean, introduced in the late 1980s by researchers at the University of Nijmegen, is a pure, lazy functional language that combines lazy evaluation with uniqueness typing to manage state in a referentially transparent manner, enabling efficient I/O without impurity. Designed for high-performance, real-world applications, its graph rewriting model supports parallel execution and persistent data structures. In the 2000s and 2010s, purely functional languages transitioned from primarily academic tools to practical implementations, driven by advancements in compilers and libraries that addressed performance concerns, enabling adoption in industry and beyond. A notable example is Elm, released in 2012 by Evan Czaplicki, which applies purely functional principles to front-end web applications through a domain-specific language that compiles to JavaScript, emphasizing immutability and no runtime exceptions for reliable user interfaces.
Another example is PureScript, created in 2013 by Phil Freeman, a purely functional language strongly inspired by Haskell that compiles to JavaScript. It features advanced type-system capabilities like row polymorphism and supports effect systems for handling side effects in a pure manner, making it suitable for full-stack web development, with libraries for user interfaces and backend integration via Node.js. As of 2025, PureScript remains actively maintained with a vibrant ecosystem for browser-based applications.

Key Features and Paradigms

In purely functional programming, higher-order functions treat functions as first-class citizens, allowing them to be passed as arguments, returned as results, and composed to form more abstract computations. This enables powerful patterns such as mapping a function over a collection to transform each element uniformly or filtering elements based on a predicate function, promoting reusable and modular code structures. Strong static type systems with polymorphism form a cornerstone of purely functional languages, inferring types automatically to catch errors at compile time and ensuring program correctness without runtime overhead. The Hindley-Milner type system, featuring principal type schemes and unification via Algorithm W, supports parametric polymorphism, allowing generic functions to work across types while guaranteeing type safety. Pattern matching serves as the primary mechanism for control flow, deconstructing data structures to bind variables and select behaviors based on structural shapes, thereby replacing imperative conditionals and loops with declarative expressions. Recursion replaces iteration as the standard for repetitive tasks, with tail-recursion optimization transforming qualifying recursive calls into efficient loops that prevent stack overflow and maintain constant space usage. Advanced paradigms extend these foundations for more expressive abstractions. Type classes enable ad-hoc polymorphism by associating operations with types through overloaded instances, allowing flexible definitions of behaviors like equality or ordering without subclassing. Monads abstract over sequenced computations, encapsulating effects such as state or failure in a composable manner while preserving purity through their bind and return operations. Modern extensions, such as the linear types brought into mainstream implementations by post-2010 research, track resource usage by requiring values to be consumed exactly once, facilitating safe management of resources like file handles and memory and preventing leaks in performance-critical applications.
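Several of these features can be illustrated together with a short Scala sketch using only the standard library (`sumList` is an illustrative name):

```scala
import scala.annotation.tailrec

// Pattern matching deconstructs the list; @tailrec asks the compiler to
// verify that the recursive call compiles to a loop (constant stack space).
@tailrec
def sumList(xs: List[Int], acc: Int = 0): Int = xs match {
  case Nil          => acc
  case head :: tail => sumList(tail, acc + head)
}

// Higher-order functions: map transforms each element uniformly,
// filter selects elements satisfying a predicate.
val doubled = List(1, 2, 3).map(_ * 2)             // List(2, 4, 6)
val evens   = List(1, 2, 3, 4).filter(_ % 2 == 0)  // List(2, 4)

val total = sumList(List(1, 2, 3, 4))              // 10
```

The accumulator places the recursive call in tail position, so the `@tailrec` annotation is satisfied and the function runs in constant stack space even on long lists.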
Despite these strengths, purely functional programming presents challenges, including a steep learning curve due to its departure from imperative idioms, requiring developers to rethink iteration and state. Performance tuning also demands expertise, as lazy evaluation and higher-order constructs can introduce overhead in hot paths or resource-constrained environments, necessitating optimizations like strictness analysis.
