
Patience sorting

Patience sorting is a comparison-based sorting algorithm inspired by the card game patience (also known as solitaire), in which a sequence of values (typically represented as cards) is processed by greedily placing each value onto the leftmost possible decreasing pile, starting a new pile if no suitable placement exists; the resulting arrangement reveals the length of the longest increasing subsequence (LIS) as the number of piles formed. The algorithm operates by maintaining a collection of piles, each strictly decreasing from bottom to top, and employs binary search to efficiently locate the appropriate pile for each new element, yielding an overall time complexity of O(n log n) for a sequence of length n. This efficiency stems from the greedy placement rule, which ensures that the minimal number of piles equals the length of the LIS in the input sequence, establishing a direct combinatorial link between the sorting process and dynamic programming problems involving subsequences. Historically, patience sorting was independently discovered in the early 1960s by researchers including A. S. C. Ross and Robert W. Floyd, with early analysis by C. L. Mallows focusing on its probabilistic properties for random permutations. For a uniformly random permutation of n elements, the expected number of piles (and thus the expected LIS length) is asymptotically approximately 2√n, with refinements showing convergence in probability and bounded variance, results that illuminate the typical behavior of random permutations. Beyond sorting, the algorithm optimizes solutions to LIS-related dynamic programming tasks, reducing running time from O(n²) to O(n log n) by leveraging the pile tops to track candidate subsequence endings. Notable extensions include its use in manual card sorting (the statistician Persi Diaconis has speculated that it is the fastest practical method for sorting cards by hand) and deeper combinatorial studies that model pile formation as an iterated insertion process akin to the Robinson–Schensted–Knuth correspondence. Patience sorting's elegance lies in its dual role as both a practical sorter and a tool for probing the structure of sequences, making it a staple of algorithm design and probabilistic combinatorics.

Introduction

Overview

Patience sorting is a comparison-based sorting algorithm inspired by and named after the solitaire card game known as patience, in which a shuffled deck is sorted by forming piles according to specific placement rules. The method was invented in the early 1960s by the linguist A. S. C. Ross as a practical way of sorting cards; C. L. Mallows first described and named it "patience sorting", and Robert W. Floyd independently discovered a variant in 1964. In the algorithm, the input sequence of distinct elements (analogous to card values 1 through n) is processed from left to right. For each element, an attempt is made to place it on the leftmost existing pile where it is strictly smaller than the current top element of that pile; if no such pile exists, the element starts a new pile to the right of all existing piles. Each pile maintains a strictly increasing order from top to bottom, and the top elements across piles are strictly increasing from left to right.

Consider the sequence 3, 1, 4, 2, 5 as an example. The first element 3 forms the initial pile with top 3. The next element 1 is smaller than 3, so it is placed on that pile, making the top 1 (pile: 1 over 3). The element 4 is larger than 1, so it starts a new pile with top 4. The element 2 is larger than 1 but smaller than 4, so it is placed on the second pile (pile 2: 2 over 4). Finally, 5 is larger than both 1 and 2, so it starts a third pile with top 5. The resulting piles are [1 over 3], [2 over 4], and [5]. To obtain the fully sorted sequence, the piles, each already increasing from top to bottom, are merged using a multi-way merge process: repeatedly select and remove the smallest current top across all piles, exposing the next element in that pile, until all elements are removed. For the example above, the merge yields the sorted order 1, 2, 3, 4, 5. The number of piles formed also equals the length of the longest increasing subsequence in the input, linking patience sorting to subsequence problems.

Key Concepts

Patience sorting operates by processing an input sequence of distinct elements, such as card values from 1 to n, one at a time to form a series of piles arranged in a row. Each pile consists of elements that form a decreasing subsequence relative to their order in the input sequence, with the physical arrangement of the pile having the smallest element (the most recently added) at the top and larger elements below it. This structure ensures that each element in a pile is larger than the one immediately above it in the stack. The core placement rule governs how each new element is assigned: it is positioned on the leftmost pile whose current top element is strictly greater than the new element, allowing the new element to become the new top since it is smaller. If no existing pile satisfies this condition—meaning the new element is larger than all current tops—a new pile is initiated at the right end of the row. This leftmost greedy choice maintains the invariant that the top elements across piles are strictly increasing from left to right at every step. The resulting number of piles quantifies the inherent disorder in the input sequence, interpreted as the minimal number of decreasing subsequences required to partition the entire sequence; fewer piles indicate a more ordered input, while more piles reflect greater "patience" needed to resolve the arrangement. Patience sorting draws inspiration from solitaire card games but diverges as a fully deterministic procedure without probabilistic shuffling or complex winning conditions, focusing instead on systematic pile formation as a foundation for sorting.
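The placement rule and the increasing-tops invariant described above can be checked with a short sketch (the function name build_piles is illustrative, not from any library):

```python
# Sketch of the placement rule: each element goes on the leftmost pile
# whose top is strictly greater, else it starts a new pile on the right.

def build_piles(sequence):
    piles = []  # each pile is a list; the last entry is the pile's top
    for x in sequence:
        for pile in piles:
            if pile[-1] > x:       # leftmost pile with a strictly greater top
                pile.append(x)
                break
        else:                      # no suitable pile: start a new one
            piles.append([x])
        # invariant: pile tops are strictly increasing from left to right
        tops = [p[-1] for p in piles]
        assert tops == sorted(tops)
    return piles

piles = build_piles([3, 1, 4, 2, 5])
print([p[-1] for p in piles])  # tops of the three piles: [1, 2, 5]
```

Running this on the sequence 3, 1, 4, 2, 5 yields three piles, matching the interpretation of the pile count as the minimal number of decreasing subsequences partitioning the input.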

Algorithm

Description

Patience sorting operates on an input consisting of an unsorted list of comparable elements, such as integers. The algorithm initializes an empty collection of piles, where each pile is a stack of elements arranged in decreasing order from bottom to top. For each successive element in the input list, the algorithm scans the existing piles from left to right (oldest to newest) and identifies the leftmost pile whose top element is strictly greater than the incoming element. The incoming element is then placed on top of that pile, preserving the decreasing order within the pile. If no such pile exists, meaning the incoming element is greater than or equal to the top of every existing pile, a new pile is created at the right end of the collection, consisting solely of the incoming element. This placement rule ensures that the tops of the piles remain in non-decreasing order from left to right. Once all elements have been placed into piles, the algorithm reverses the order within each pile to convert it from a decreasing to an increasing sequence. The resulting increasing sequences are then merged from left to right using a multi-way merge process, where at each step the smallest head among the current piles is selected and appended to the output list, producing the fully sorted sequence in ascending order. To illustrate, consider the input sequence [5, 3, 4, 1, 2]. The piles evolve as follows:
  • After 5: Pile 1 = [5] (top: 5)
  • After 3 (3 < 5): Pile 1 = [5, 3] (top: 3)
  • After 4 (4 > 3): Pile 1 = [5, 3]; Pile 2 = [4] (top: 4)
  • After 1 (1 < 3): Pile 1 = [5, 3, 1] (top: 1); Pile 2 = [4] (top: 4)
  • After 2 (2 > 1 but 2 < 4): Pile 1 = [5, 3, 1]; Pile 2 = [4, 2] (top: 2)
Reversing the piles yields Pile 1 = [1, 3, 5] and Pile 2 = [2, 4]. Merging these by repeatedly selecting the smallest head produces the sorted output [1, 2, 3, 4, 5].

Pseudocode

The patience sorting algorithm can be expressed in pseudocode as follows, reflecting the process of building decreasing piles and then merging them to produce a sorted output. This representation assumes comparable elements and uses a linear search for pile placement, resulting in O(n²) time complexity in the worst case, though optimizations like binary search on pile tops are possible for the placement step.
function patienceSort(input_list):
    piles = []  // list of lists (or deques) to represent piles
    for each num in input_list:
        placed = false
        for each pile in piles:
            if num < pile.back():  // back() accesses the top (most recent) element
                pile.append(num)
                placed = true
                break
        if not placed:
            new_pile = [num]
            piles.append(new_pile)
    
    // Merge piles by repeatedly extracting the smallest top
    output = []
    while piles is not empty:
        min_top = infinity
        min_pile_index = -1
        for i = 0 to piles.length - 1:
            if piles[i] is not empty and piles[i].back() < min_top:
                min_top = piles[i].back()
                min_pile_index = i
        output.append(min_top)
        piles[min_pile_index].pop_back()  // remove top
        if piles[min_pile_index] is empty:
            remove piles[min_pile_index] from piles
    return output
In this pseudocode, the piles data structure is implemented as a list of deques (or dynamic arrays/lists) to allow O(1) access and modification of the top element via back(), append(), and pop_back() operations. Each pile maintains elements in decreasing order from bottom to top (earliest to latest). For the merge phase, a linear scan identifies the smallest top across non-empty piles, simulating a simple k-way merge without additional data structures. Regarding ties, the strict less-than (<) test means an element equal to a pile's top skips that pile and lands either on a later pile with a strictly greater top or in a new pile, so duplicates always end up in distinct piles, with earlier duplicates in piles further to the left. Whether the output preserves the input order of equal elements then depends on the merge: the linear scan above keeps the first (leftmost) minimal top, which favors earlier duplicates, but patience sort is generally described as unstable because common implementations (for example, heap-based merges without positional tie-breaking) do not guarantee this. A stable variant can be obtained by tagging each element with its original index and comparing (value, index) pairs, which forces equal values to be ordered by position throughout both phases.
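The pseudocode above translates directly into Python. This sketch follows the same placement loop but, as one possible variation, replaces the linear merge scan with a heap of (top, pile index) pairs:

```python
import heapq

# Runnable translation of the pseudocode: build decreasing piles, then
# k-way merge them by repeatedly extracting the smallest current top.

def patience_sort(items):
    piles = []                      # each pile is decreasing bottom-to-top
    for x in items:
        for pile in piles:
            if x < pile[-1]:        # leftmost pile with top > x
                pile.append(x)
                break
        else:
            piles.append([x])       # no suitable pile: new pile on the right
    # k-way merge: keep (top, pile index) pairs in a min-heap
    heap = [(pile[-1], i) for i, pile in enumerate(piles)]
    heapq.heapify(heap)
    output = []
    while heap:
        top, i = heapq.heappop(heap)
        output.append(top)
        piles[i].pop()              # remove the extracted top
        if piles[i]:
            heapq.heappush(heap, (piles[i][-1], i))
    return output

print(patience_sort([5, 3, 4, 1, 2]))  # [1, 2, 3, 4, 5]
```

The heap reduces each merge step from a linear scan over all piles to O(log k) work, which matters when many piles form.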

Analysis

Time and Space Complexity

Patience sorting exhibits a time complexity of O(n log n) in both the average and worst cases when used for sorting a sequence of n elements, primarily due to the optimization of using binary search to determine the appropriate pile for each element. This efficiency arises from the placement phase, where each of the n elements requires a binary search over at most n piles, incurring O(log k) time per insertion with k ≤ n piles, followed by a merging phase that combines the piles in O(n log k) time overall. A related analysis for computing the longest increasing subsequence length establishes a tight bound of n log n - n log log n + O(n) comparisons in the worst case, supporting the O(n log n) characterization for the full sorting process. In contrast, the naive implementation of patience sorting, which relies on a linear scan to identify the suitable pile for each element rather than binary search, results in O(n²) time complexity, as each of the n insertions may require scanning up to n existing piles. This quadratic behavior is inherent to the unoptimized procedure described in the foundational literature on the algorithm. Regarding space complexity, patience sorting requires O(n) additional space to store the piles and associated data structures, since the number of piles is at most n in the worst case, with each pile holding references to elements without duplicating the input array.
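The binary-search optimization discussed above can be sketched as follows: because the pile tops are always kept in increasing order, the leftmost pile whose top exceeds the new element can be located with Python's bisect module in O(log k) time. The function name build_piles_fast is illustrative:

```python
from bisect import bisect_right

# O(n log n) placement phase: binary search on the sorted list of pile
# tops replaces the linear scan of the naive implementation.

def build_piles_fast(items):
    piles, tops = [], []            # tops[i] mirrors piles[i][-1]
    for x in items:
        i = bisect_right(tops, x)   # index of first pile with top > x
        if i == len(piles):
            piles.append([x])       # x exceeds every top: new pile
            tops.append(x)
        else:
            piles[i].append(x)      # place x, lowering that pile's top
            tops[i] = x
    return piles

print(len(build_piles_fast([5, 3, 4, 1, 2])))  # 2 piles
```

Only the placement phase changes; the merge phase is unaffected and still dominates with O(n log k) work overall.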

Correctness Proof

The correctness of patience sorting rests on the structure of the piles formed during the placement phase and on the subsequent multi-way merge step. Each pile is increasing from top to bottom (equivalently, decreasing from bottom to top), since a card is placed on top of an existing pile only if it is strictly smaller than the current top card. The elements within each pile appear in their original input order from bottom to top, so each pile, read in input order, is a decreasing subsequence of the input. A key property maintained throughout the placement process is that the top cards of the piles form an increasing sequence from left to right. This holds by induction on the number of cards processed. For the base case with the first card, there is a single pile, trivially satisfying the property. When processing a new card x, if it is placed on the leftmost eligible pile i (where the top of pile i satisfies t_i > x), then for all piles j < i the tops satisfy t_j < x (otherwise placement would occur on an earlier pile, since x < t_j would hold), and for all piles j > i the tops exceed the previous top of pile i, which is greater than x. Thus, after updating the top of pile i to x, the tops remain increasing: t_1 < t_2 < \dots < t_{i-1} < x < t_{i+1} < \dots < t_k. If a new pile is created, x exceeds all existing tops, appending a larger top to the sequence. At the end, therefore, the top cards satisfy t_1 < t_2 < \dots < t_k, where each t_i is the minimum element of pile i. The piles thus partition the input into k sequences that are increasing from top to bottom, with increasing minima m_1 < m_2 < \dots < m_k, where m_i = t_i.
The multi-way merge of these piles produces the fully sorted output because each pile, read from top to bottom, is a sorted list, and a k-way merge that repeatedly removes the smallest current head from a collection of sorted lists emits elements in non-decreasing order: at each step the selected head is no larger than any remaining element, since every other head is at least as large and every buried element exceeds the head above it. Because the piles partition the input exactly once and each is internally increasing, the merged result contains every input element exactly once, in non-decreasing order. The greedy leftmost placement rule is what guarantees the merge starts from a valid state: every placement either extends a pile while keeping it internally increasing from top to bottom or opens a new pile, so the increasing-tops invariant is preserved at every step and the final piles are always a collection of sorted lists ready to be merged.

Applications

In Sorting

Patience sorting serves as an alternative to traditional comparison-based sorting algorithms such as quicksort and mergesort, offering a guaranteed time complexity of O(n log n) in the worst case through its pile-creation and multi-way merge phases. One key advantage of patience sorting lies in its simplicity of implementation, requiring only basic data structures like lists or queues to manage piles, which makes it particularly intuitive to visualize, much like dealing cards in a solitaire game, and thus valuable for teaching sorting concepts. However, it has notable disadvantages, including lower cache efficiency than in-place algorithms like quicksort due to its reliance on auxiliary space for piles, and the basic algorithm is unstable, potentially altering the relative order of equal elements unless modified. To address stability, variants of patience sorting track the original indices of elements during pile formation, allowing the merge step to break ties between equal values in input order. In practice, optimized implementations of patience sorting, such as PatSort, demonstrate empirical performance comparable to Timsort on random data, running around 20% faster than standard quicksort implementations in benchmarks on modern processors.
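One way to realize the index-tracking stable variant mentioned above is a decorate-sort-undecorate sketch: each element is tagged with its original position so that equal values compare by position, making the output order of duplicates match the input order. The helper name stable_patience_sort and the heap-based merge are illustrative assumptions, not a canonical implementation:

```python
import heapq
from bisect import bisect_right

# Stable variant sketch: tag elements with their original index so equal
# values compare by position, then strip the tags after merging.

def stable_patience_sort(items):
    tagged = [(v, i) for i, v in enumerate(items)]  # (value, original index)
    piles, tops = [], []
    for pair in tagged:
        j = bisect_right(tops, pair)  # leftmost pile with top > pair
        if j == len(piles):
            piles.append([pair])
            tops.append(pair)
        else:
            piles[j].append(pair)
            tops[j] = pair
    # k-way merge on the tagged pairs, then drop the index tags
    heap = [(p[-1], k) for k, p in enumerate(piles)]
    heapq.heapify(heap)
    out = []
    while heap:
        pair, k = heapq.heappop(heap)
        out.append(pair[0])
        piles[k].pop()
        if piles[k]:
            heapq.heappush(heap, (piles[k][-1], k))
    return out

print(stable_patience_sort([3, 1, 3, 2]))  # [1, 2, 3, 3]
```

Because the tagged pairs are all distinct, every comparison in both phases is strict, and the sorted order of (value, index) pairs is exactly the stable order of the values.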

In Longest Increasing Subsequence

Patience sorting provides an efficient method for computing the length of the longest increasing subsequence (LIS) in a sequence of n distinct numbers, where the number of piles formed at the end equals the LIS length. This connection arises because the greedy placement rule in patience sorting implicitly constructs a structure that tracks the minimal possible endings for increasing subsequences of various lengths, ensuring the pile count reflects the maximum achievable LIS length. In the standard patience sorting process, each pile maintains cards in increasing order from top to bottom, but for LIS computation, the algorithm adapts by focusing solely on the top (smallest) card of each pile, which represents the smallest tail value for all increasing subsequences of that pile's length. This adaptation leverages dynamic programming principles, where piles correspond to subsequence lengths, and new elements are placed by finding the leftmost pile whose top is larger than the element (or starting a new pile if none exists). To achieve O(n \log n) time complexity, binary search is used to locate the appropriate pile for each of the n elements, maintaining an "active list" of the smallest tails across pile lengths. For example, consider the sequence 3, 1, 4, 2, 5:
  • Place 3: one pile with top 3.
  • Place 1: replace top of first pile (now top 1, 3 below).
  • Place 4: cannot place on first pile (1 < 4), so start second pile with top 4.
  • Place 2: cannot place on first (1 < 2), but can on second (4 > 2), so replace top of second pile (now top 2, 4 below).
  • Place 5: cannot place on first (1 < 5) or second (2 < 5), so start third pile with top 5.
This results in three piles, indicating an LIS length of 3, such as the subsequence 1, 2, 5. The correctness relies on the invariant that the top of the k-th pile is the smallest possible ending value among all increasing subsequences of length k encountered so far. By the greedy strategy, no longer subsequence can exist without violating the placement rules, and one can reconstruct an LIS of the computed length by tracing back from pile tops to prior placements, proving that the pile count matches the true LIS length.
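The pile-top adaptation described above reduces to the classic O(n log n) LIS routine, since only the tops need to be stored. A minimal sketch (for the distinct elements assumed here bisect_left and bisect_right coincide; bisect_left also handles duplicates by computing a strictly increasing LIS):

```python
from bisect import bisect_left

# tops[k] is the smallest value ending an increasing subsequence of
# length k+1 seen so far; the list length equals the current LIS length.

def lis_length(sequence):
    tops = []
    for x in sequence:
        i = bisect_left(tops, x)   # pile whose top should become x
        if i == len(tops):
            tops.append(x)          # start a new pile
        else:
            tops[i] = x             # replace that pile's top
    return len(tops)

print(lis_length([3, 1, 4, 2, 5]))  # 3, e.g. the subsequence 1, 2, 5
```

Note that tops itself is generally not an increasing subsequence of the input; only its length is meaningful, and reconstructing an actual LIS requires recording, for each element, the pile it landed on.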

History and Developments

Invention

Patience sorting originated as a practical method for sorting decks of playing cards by hand, drawing inspiration from the family of solitaire card games known as patience. The algorithm was invented by A. S. C. Ross in the early 1960s as a procedure for building piles of cards, where each new card is placed on the leftmost pile with a top card greater than or equal to it, or starts a new pile if no such pile exists. This pile-building approach mimics the layout rules of certain patience games, allowing shuffled cards to be organized without comparing every pair across the entire deck. The term "patience sorting" was coined by Colin L. Mallows, who first described and named the method in a problem posed in the SIAM Review in 1962, crediting Ross for its invention. Mallows presented it as a novel technique and solicited solutions for analyzing its properties, particularly the expected number of piles formed from a randomly shuffled deck. The solution, published in 1963, confirmed the method's O(n log n) average-case behavior for sorting via subsequent merging of the piles, establishing its theoretical foundation early on. Independently, the computer scientist Robert W. Floyd rediscovered a variant of patience sorting in 1964 while developing an algorithm to compute the length of the longest increasing subsequence in a sequence, recognizing its utility in partitioning sequences into decreasing piles. This computing-oriented perspective highlighted the procedure's role beyond manual card sorting, and it was developed further later in the decade in correspondence between Floyd and Donald Knuth. The initial motivation remained rooted in modeling simple, greedy decision-making for physical sorting tasks, such as organizing real decks of cards after shuffling, bridging manual games and mathematical modeling.

Subsequent Research

In the 1990s, subsequent research on patience sorting deepened its ties to combinatorial probability and patience games. Aldous and Diaconis surveyed the algorithm's connections to the longest increasing subsequence problem, highlighting its role in modeling random permutations and providing probabilistic analyses of pile distributions under uniform shuffling. Their work emphasized how the number of piles, suitably centered and rescaled, follows the Tracy-Widom distribution in the limit, bridging card games with advanced stochastic processes. During the 2000s, optimizations refined patience sorting's use for longest-increasing-subsequence computations, achieving O(n log n) time through binary-search-based pile maintenance. This approach gained practical traction, with implementations leveraging modules like Python's bisect for inserting elements into virtual piles while preserving sorted order. Such integrations facilitated efficient LIS computation in software libraries, underscoring the algorithm's balance of simplicity and performance. In the 2010s, research extended patience sorting to streaming models, treating it as a one-pass algorithm and asking how well LIS lengths can be approximated in sublinear space. Gál and Gopalan established communication and space lower bounds for deterministic streaming approximation of the LIS length, while Saks and Seshadhri developed space-efficient variants achieving O(n^{1/2} polylog n) space for approximations, improving upon naive adaptations. Up to 2025, applications have emerged in machine learning for sequence modeling, particularly in time series analysis: techniques for approximating longest increasing subsequences enable efficient computation of dynamic time warping distances, aiding alignment and similarity search in sequential data. Theoretical extensions have also analyzed patience sorting monoids and their combinatorial enumerations via bijective correspondences with tableaux.

References

  1. Patience (PDF), cs.Princeton. Lecture notes on an almost-trivial O(n^{3/2}) sorting algorithm; records Persi Diaconis's speculation that patience sorting is the fastest way to sort a pile of cards by hand.
  2. Patience Sorting, Longest Increasing Subsequences and a ... (PDF), July 8, 1993. The simplest algorithm for computing the length of the longest increasing subsequence, viewed as a card game: patience sorting.
  3. Patience Sorting (DP Optimisation) (PDF). Patience sorting as a technique that transforms longest-increasing-subsequence dynamic programming solutions from O(n²) to O(n log n).
  4. Combinatorics of patience sorting piles, arXiv:math/0506358. Develops some of the basic combinatorics of the patience sorting algorithm.
  5. Longest Increasing Subsequences: From Patience ... (PDF), July 21, 1999. Describes the one-person card game patience sorting; its analysis leads to a broad circle of ideas linking Young tableaux.
  6. Patience is a virtue: revisiting merge and sort on modern processors. Patience sort as two phases, the creation of sorted runs and the merging of these runs, improved through a combination of algorithmic and architectural techniques.
  7. Patience Sorting, GeeksforGeeks, July 12, 2025. A sorting algorithm based on the card game patience, using the rules of the game to sort a list of elements.
  8. Sorting algorithms/Patience sort, Rosetta Code. Task: sort an array of numbers into ascending order using patience sorting.
  9. Enumerating longest increasing subsequences and patience sorting, August 6, 2025. Presents three algorithms solving three related combinatorial optimization problems.
  10. Patience Sort Algorithm, Baeldung on Computer Science, March 18, 2024. Patience sort as an unstable algorithm with two parts: distributing elements into sorted piles and merging the piles.
  11. Patience is a Virtue: Revisiting Merge and Sort on Modern Processors (PDF). Revisits sorting and merging in main memory, showing that the long-forgotten patience sort can be competitive.
  12. Combinatorics of Patience Sorting Piles (PDF). Notes that the term "patience sorting" was introduced in 1962 by C. L. Mallows as the name of a card sorting algorithm invented by A. S. C. Ross.
  13. bisect — Array bisection algorithm, Python 3.14.0 documentation. Support for maintaining a list in sorted order without re-sorting after each insertion.
  14. Longest Increasing Subsequence Size (N log N), GeeksforGeeks, May 7, 2025. Finding the length of the longest increasing subsequence in O(n log n).
  15. Lower Bounds on Streaming Algorithms for Approximating the Length of the Longest Increasing Subsequence, Anna Gál and Parikshit Gopalan.
  16. Space efficient streaming algorithms for LIS and asymmetric LCS (PDF). Estimating the length of the longest increasing subsequence of a data stream.
  17. A Faster Reduction of the Dynamic Time Warping Distance to the ..., May 11, 2022. Shows that the DTW distance can be represented by the longest increasing subsequence length of a sequence of integers.
  18. Combinatorics of patience sorting monoids, arXiv:1801.05591, January 17, 2018. A combinatorial study of the two monoids and two types of tableaux arising from generalizations of patience sorting.