Quantum volume
Quantum volume is a metric that quantifies the performance of near-term quantum computers by measuring the largest random square circuit, equal in width (number of qubits) and depth (number of gate layers), that a device can execute with sufficient fidelity to reproduce the expected output distribution. Introduced by IBM researchers in 2019, it serves as an architecture-neutral benchmark for noisy intermediate-scale quantum (NISQ) devices, capturing the interplay of qubit count, gate error rates, qubit connectivity, and compilation efficiency in a single value, V_Q = 2^k, where k is the largest integer for which such a circuit succeeds.

To measure quantum volume, experiments generate ensembles of random model circuits, for example with the QuantumVolume circuit class in Qiskit. A model circuit on m qubits consists of d layers, each applying a random permutation of the qubits followed by Haar-random two-qubit unitaries from SU(4) acting on ⌊m/2⌋ pairs; the random permutations emulate full connectivity regardless of the hardware's native coupling map. Success is determined by the heavy output probability (HOP): the heavy outputs of a circuit are the bitstrings whose ideal output probabilities exceed the median of the ideal distribution, and the HOP is the fraction of measured samples that fall in this set. A width-depth combination passes if the average HOP exceeds 2/3 over at least 100 random circuits, with statistical confidence above 97.7% (a z-score of 2). The exponent k is then min(m, d(m)), maximized over possible m, where d(m) is the largest depth that passes at width m, providing a pragmatic measure of usable quantum computation despite imperfect hardware.[1]
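The circuit construction and heavy-output test above can be illustrated with a minimal NumPy sketch (a toy simulation, not Qiskit's actual implementation): it builds noiseless model circuits of permutation layers and Haar-random SU(4) gates, identifies the heavy outputs from the ideal distribution, and computes the HOP. For ideal (noise-free) execution, the HOP concentrates near (1 + ln 2)/2 ≈ 0.847, comfortably above the 2/3 pass threshold; real hardware noise pulls it down toward 1/2. The width m = 4 and trial count here are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def haar_unitary(dim):
    # Haar-random unitary via QR decomposition of a complex Gaussian matrix,
    # with the phases of R's diagonal divided out to get the Haar measure
    z = (rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))

def qubit_permutation_matrix(perm, m):
    # basis permutation induced by moving qubit j to position perm[j]
    dim = 2 ** m
    P = np.zeros((dim, dim))
    for i in range(dim):
        bits = [(i >> (m - 1 - j)) & 1 for j in range(m)]
        new = [0] * m
        for j in range(m):
            new[perm[j]] = bits[j]
        k = int("".join(map(str, new)), 2)
        P[k, i] = 1.0
    return P

def qv_circuit_unitary(m, depth):
    # model circuit: each layer is a random qubit permutation followed by
    # Haar-random SU(4) gates on floor(m/2) adjacent pairs
    U = np.eye(2 ** m, dtype=complex)
    for _ in range(depth):
        layer = qubit_permutation_matrix(rng.permutation(m), m)
        gates = np.eye(1)
        for _ in range(m // 2):
            gates = np.kron(gates, haar_unitary(4))
        if m % 2:
            gates = np.kron(gates, np.eye(2))  # odd width: one idle qubit
        U = gates @ layer @ U
    return U

m = 4
hops = []
for _ in range(20):
    U = qv_circuit_unitary(m, m)       # square circuit: depth = width
    p = np.abs(U[:, 0]) ** 2           # ideal output distribution from |0...0>
    heavy = p > np.median(p)           # heavy outputs: above-median probability
    hops.append(p[heavy].sum())        # noiseless HOP for this circuit

hop = np.mean(hops)
print(round(hop, 3))  # expected to land well above the 2/3 threshold
```

On a real device the same heavy-output sets would be computed classically from the ideal circuits, while the sampled bitstrings come from hardware; that need for classical simulation is what limits the widths at which quantum volume can be verified.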
Since its proposal, quantum volume has become a key industry benchmark for tracking progress: IBM's early systems achieved V_Q = 16 in 2019, and subsequent advances have pushed far beyond, with Quantinuum's H2 system reaching V_Q = 2^{25} = 33,554,432 in September 2025, reflecting improvements in error rates, error mitigation, and scaling. While it emphasizes practical NISQ capability, the metric has limitations: validating heavy outputs relies on classical simulation of the ideal circuits, which becomes intractable at large widths, and the balanced width-depth scaling is an assumption not all workloads share, prompting extensions such as volumetric benchmarks for broader testing. The metric underscores the path from current noisy devices toward fault-tolerant quantum computing, influencing hardware design and algorithmic development at major players such as IBM and Quantinuum.[2][3]