
Reproducible builds

Reproducible builds are a set of practices designed to ensure that, given the same source code, build environment, and build instructions, any party can recreate bit-by-bit identical copies of specified software artifacts, such as executables or distribution packages. This approach creates an independently verifiable path from human-readable source code to the binary code used in production, allowing third parties to confirm that distributed binaries match the claimed source without relying on trust in a single distributor.

The primary motivation for reproducible builds is to enhance software security and trustworthiness, particularly in open-source ecosystems where binaries are distributed widely and could be tampered with during compilation. By enabling bit-for-bit verification, often through cryptographic hashes, reproducible builds help detect supply-chain attacks, backdoors, or unauthorized modifications, while also supporting compliance with licensing and export controls. Challenges in achieving reproducibility include non-deterministic factors like timestamps, file orders, or environment variables (e.g., locale settings), which tools like SOURCE_DATE_EPOCH and strip-nondeterminism address by standardizing these elements across builds. Beyond security, the practice improves quality assurance, resilience against compromised build systems, and caching efficiency in large projects.

The concept traces its roots to the early 1990s, when GNU tools first implemented deterministic compilation techniques, though widespread adoption began later. Momentum built in the 2010s, spurred by security concerns in cryptocurrency and privacy tools: in 2011, Gitian was developed for Bitcoin to enable verifiable builds, and in 2013, the Tor Browser achieved reproducible builds under Mike Perry's leadership. That same year, the Debian project initiated systematic efforts at DebConf13, led by figures like Lunar and Holger Levsen, conducting the first mass rebuild that found 24% of packages reproducible; by 2015, this rose to 80% through tools like .buildinfo files and continuous testing via jenkins.debian.net.
Debian formalized reproducibility in its policy by 2017, with penalties for non-reproducible changes introduced during Debian 14's development starting in 2025, and a long-term goal of 100% reproducibility by 2031. Debian 13, released in August 2025, achieved approximately 98% reproducible packages, though full system reproducibility is ongoing. Today, reproducible builds are adopted across major distributions and projects, including Arch Linux (around 86% reproducible as of 2025), Fedora, Tails (fully reproducible ISOs since 2017), NixOS, and Guix, as well as various build systems and toolchains. The Reproducible Builds organization coordinates global efforts through summits, documentation, and tools like diffoscope for artifact comparison, with ongoing initiatives targeting 100% reproducibility by 2031 in key ecosystems.

Definition and Principles

Definition

Reproducible builds refer to a process in which, given the same source code, build environment, and build instructions, any party can recreate bit-by-bit identical copies of all specified output artifacts. This approach ensures that the resulting binaries or packages are verifiable through direct byte-level comparison, often using cryptographically secure hash functions to confirm identity. A key distinction exists between source code reproducibility, which involves compiling the same source code under controlled conditions to produce identical binaries, and full supply chain reproducibility, which encompasses the entire process including fetched dependencies, toolchain versions, and environmental factors to achieve the same outcome. The goal is deterministic compilation, where consistent inputs lead to identical outputs without variation from non-deterministic elements like timestamps or hardware differences. Examples of such outputs include compiled executables, distribution packages like Debian's .deb files, where byte-for-byte reproducibility is verified across multiple builds, and container images such as Docker images, which can be made identical to enable independent verification of their contents.
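The byte-level comparison described above can be sketched in Python. This is a minimal illustration, not a tool from any project mentioned here; the file paths are hypothetical, and SHA-256 from the standard library stands in for whatever digest a project uses:

```python
import hashlib

def artifact_digest(path: str) -> str:
    """Compute the SHA-256 digest of a build artifact, streaming
    the file in chunks so large binaries do not exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def is_reproducible(path_a: str, path_b: str) -> bool:
    """Two builds are reproducible if their artifacts match bit-for-bit,
    i.e. their cryptographic digests are identical."""
    return artifact_digest(path_a) == artifact_digest(path_b)
```

Comparing digests rather than raw bytes lets independent parties publish and check short strings instead of exchanging full artifacts.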

Core Principles

The core principle of determinism in reproducible builds requires that every step of the build process yields identical outputs when provided with the same inputs, thereby eliminating variability from sources such as randomness, timestamps, and environment variables. This ensures that the resulting binaries are bit-for-bit identical across multiple invocations, building on the foundational definition of reproducibility as achieving exact matches in output artifacts. By enforcing determinism, developers can verify that no unintended modifications occur during compilation, fostering trust in the software supply chain.

Complementing determinism is the isolation principle, also known as hermeticity, which mandates that builds operate in a self-contained environment unaffected by the host system's state, such as the current date, user identifiers, or external network resources. This hermetic approach prevents leakage of machine-specific details into the output, ensuring that the build process relies solely on explicitly declared dependencies and inputs. As a result, the same source code can produce consistent results regardless of the build location or timing, enhancing portability and verifiability.

To achieve verifiable integrity, reproducible builds emphasize standardized ordering in data structures and the use of cryptographic hashing for output validation. File archives, for instance, must employ deterministic sorting to avoid variations from filesystem traversal orders, such as in tarballs where entries are processed in a fixed sequence rather than in whatever order the filesystem returns them. Cryptographic hashes, like SHA-256, then serve as a compact representation of the entire build output, allowing independent parties to confirm bit-for-bit identity by recomputing and comparing these digests. This hashing mechanism provides a robust foundation for integrity checks without requiring full redistribution.

At a theoretical level, these principles are supported by frameworks like Merkle trees, which model build dependencies as a hierarchical structure of hashes to track inputs and changes efficiently. In this structure, each node represents a hash of its children, encompassing source files, intermediate artifacts, and configurations, enabling a single root hash to uniquely identify the entire build. This content-addressable approach facilitates high-level verification of build integrity and supports optimizations like incremental recomputation, while maintaining the isolation and determinism of the process.
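The Merkle-tree idea can be sketched with a toy Python function. This is an illustrative model under simplifying assumptions, not how any particular build system stores its trees: build inputs are represented as a nested dict whose leaves are file contents, and entries are hashed in sorted name order so the root hash is deterministic:

```python
import hashlib

def _h(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def merkle_root(tree) -> str:
    """Hash a build-input tree: leaves are raw bytes (file contents,
    compiler flags, etc.); inner nodes are dicts mapping names to
    subtrees. Sorting the entries makes the root independent of
    dict insertion order, so identical inputs give identical roots."""
    if isinstance(tree, bytes):
        return _h(tree)
    parts = [name + ":" + merkle_root(child)
             for name, child in sorted(tree.items())]
    return _h("\n".join(parts).encode())

# Hypothetical build inputs: any change to any leaf changes the root.
build_inputs = {
    "src": {"main.c": b"int main(void){return 0;}"},
    "flags": b"-O2",
}
```

A single root digest then identifies the whole input set; comparing roots after a change immediately reveals whether any dependency differs, which is the basis for incremental recomputation.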

Importance and Applications

Security and Verification Benefits

Reproducible builds enhance software security by enabling independent verification that distributed binaries precisely match those produced from the publicly available source code, thereby detecting tampering such as malware insertion during compilation or in the supply chain. This mechanism resists attacks where adversaries modify build processes to introduce subtle vulnerabilities, like a single bit flip creating a security hole, as highlighted in analyses of compiler backdoors. By allowing users or third parties to recompile the source and compare outputs, reproducible builds establish a verifiable chain of trust from human-readable code to machine-executable binaries, mitigating risks from compromised build environments.

A key benefit is crowdsourced verification, where diverse independent parties rebuild the software and compare resulting artifacts, reducing dependence on any single trusted builder and distributing the verification effort across a community. This approach democratizes security auditing, as participants can confirm the integrity of binaries without needing access to proprietary build pipelines. In practice, tools like Debian's reprotest automate this by constructing packages in varied virtual environments, such as different timezones, user IDs, or locales, and checking for bit-for-bit matches, facilitating widespread adoption in projects like Debian's reproducible builds effort.

Cryptographic verification underpins these processes through the use of strong hashes, such as SHA-256, to assert bit-for-bit identity between independently built binaries, confirming no alterations occurred during distribution. This eliminates the need to cryptographically sign every artifact individually, as the reproducibility itself provides evidentiary proof of integrity, streamlining secure software dissemination while maintaining high assurance against undetected modifications.

In high-profile incidents like the 2015 Juniper Networks backdoor, where unauthorized code was inserted into ScreenOS firmware to enable VPN traffic decryption, reproducible builds could have aided detection by empowering independent rebuilds and comparisons against the distributed binaries, potentially exposing the compromise earlier. Similar principles apply to modern threats, such as the 2024 XZ Utils backdoor attempt, where altered build scripts introduced malicious code; reproducibility testing across environments would have flagged non-matching outputs, underscoring its role in preempting such attacks.
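The crowdsourced rebuild-and-compare idea can be sketched as a short Python function. This is a hypothetical illustration, assuming each independent builder reports a SHA-256 digest of its artifact; the builder names and digests are invented for the example:

```python
from collections import Counter

def consensus(digests):
    """Given a mapping of builder name -> artifact digest, return the
    most common digest and the list of builders whose output disagrees
    with it. A non-empty outlier list signals either non-determinism
    or a compromised build environment worth investigating."""
    majority, _count = Counter(digests.values()).most_common(1)[0]
    outliers = [builder for builder, d in digests.items() if d != majority]
    return majority, outliers

# Hypothetical reports from three independent rebuilders.
reports = {
    "builder-eu": "9f2a...",
    "builder-us": "9f2a...",
    "builder-cn": "41c7...",
}
```

In a real deployment the digests would come from signed attestations; the point here is only that agreement across independent parties removes the need to trust any single builder.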

Broader Applications

Reproducible builds extend beyond verification to enhance development workflows, particularly in debugging and isolating issues. By ensuring that identical inputs produce identical outputs, developers can more effectively bisect regressions or build failures, pinpointing the exact changes responsible for discrepancies without interference from environmental variations. For instance, in the Rust ecosystem, tools like cargo-bisect-rustc leverage reproducible compilation to automate the identification of compiler regressions by testing against historical builds, streamlining the process for contributors. This approach not only accelerates issue resolution but also fosters reliable collaboration in large-scale projects.

In regulatory contexts, reproducible builds support compliance with standards mandating verifiable software integrity, such as the U.S. Federal Information Processing Standards (FIPS) 140 series and the European Union's Cyber Resilience Act (CRA). Under FIPS, reproducible processes help maintain cryptographic module validation by confirming that binaries align with certified sources, reducing audit burdens and ensuring consistent integrity across deployments. Similarly, the CRA requires manufacturers to demonstrate secure development practices, including reproducible builds to verify that software products with digital elements meet cybersecurity obligations throughout their lifecycle. These capabilities enable organizations to meet legal requirements for transparency and accountability in software supply chains.

For containerized and cloud environments, reproducible builds ensure uniformity in image generation, critical for scalable deployments in microservices architectures and continuous integration/continuous delivery (CI/CD) pipelines. Docker, for example, supports reproducible builds through environment variables like SOURCE_DATE_EPOCH, which standardize timestamps and metadata, allowing teams to produce identical images regardless of build timing or host differences. This consistency mitigates deployment risks, such as runtime inconsistencies in cloud-native applications, and optimizes efficiency by enabling predictable artifact sharing across distributed teams.

In open-source distribution, reproducible builds bolster community trust by permitting independent verification of released binaries against published source code, a cornerstone for projects like Fedora's reproducible builds initiative. Fedora's effort aims to make nearly all RPM packages reproducible, allowing users to rebuild and compare outputs to confirm authenticity and detect potential tampering or errors in the distribution process. This practice has been adopted in major distributions, promoting wider adoption and reliability in collaborative ecosystems.

Methods and Techniques

Standardizing Build Environments

Standardizing build environments is essential for reproducible builds, as it enforces the principle of determinism by ensuring that the same inputs produce identical outputs regardless of the host system. This involves creating controlled, consistent setups that encapsulate all necessary tools, libraries, and configurations, minimizing external influences that could introduce variability. Containers and virtual machines provide a primary mechanism for achieving this standardization by encapsulating dependencies and isolating the build process from the host operating system, thereby preventing pollution from pre-installed packages or system configurations. Tools such as Docker allow developers to define a complete build environment in a Dockerfile, starting from a specified base image and installing only required components, which ensures that builds run in identical conditions across different machines or CI systems. Similarly, Buildah enables the creation of container images without a daemon, facilitating rootless and secure builds that maintain consistency for reproducible outcomes. Virtual machines, while more resource-intensive, offer stronger isolation for complex scenarios where container overhead is insufficient, such as cross-compilation tasks requiring specific kernel versions.

Dependency management further reinforces environmental consistency through techniques like pinning exact versions using lockfiles, which record the precise dependency tree to avoid resolution discrepancies over time or across environments. For JavaScript projects, npm's package-lock.json file captures the full dependency graph, including exact versions, sources, and integrity hashes, enabling npm ci to install an identical set of packages reproducibly. In Python ecosystems, requirements.txt files can include version pins and cryptographic hashes (e.g., --hash=sha256:...), allowing pip to verify and install dependencies deterministically, thus ensuring builds are secure and repeatable.
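The hash-pinning idea behind pip's --hash option can be sketched in a few lines of Python. This is a simplified model, not pip's implementation; the file path and pinned digest are hypothetical:

```python
import hashlib

def verify_pinned(path: str, pinned_sha256: str) -> None:
    """Refuse to use a fetched dependency whose bytes do not match the
    SHA-256 digest recorded in the lockfile. Even if a mirror serves a
    different artifact under the same version number, the mismatch is
    caught before the dependency enters the build."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    if digest != pinned_sha256:
        raise ValueError(f"hash mismatch for {path}: got {digest}")
```

Pinning by content hash rather than version string is what makes dependency resolution deterministic: the lockfile fixes the exact bytes, not just a name.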
Clean slate builds eliminate variability from system packages by initiating the environment from minimal or empty bases, removing any inherited state from the host. In Docker, this is achieved by using lightweight base images like scratch or Alpine, which contain only essential components, combined with multi-stage builds to discard temporary artifacts and produce a pristine final image. For Debian-based projects, environments created with pbuilder provide a clean, isolated filesystem populated solely with build dependencies from the target distribution, ensuring no external influences affect the output.

A representative example of a standardized environment is provided by Nix, which uses declarative specifications to define entire build environments reproducibly. In a Nix-based setup, developers create a shell.nix or flake.nix file that explicitly lists inputs such as package versions and sources, with nixpkgs pinned to a specific version for fixed inputs; for instance:
{ pkgs ? import (fetchTarball "https://github.com/NixOS/nixpkgs/archive/21.11.tar.gz") {} }:

pkgs.mkShell {
  buildInputs = with pkgs; [
    gcc
    pkg-config
    openssl
  ];
}
Running nix-shell then instantiates this environment, downloading and configuring dependencies into an isolated sandbox without altering the host system. Subsequent builds with nix-build produce bit-identical results, as Nix hashes all inputs—including the pinned nixpkgs—and enforces purity in the derivation process. This approach allows teams to share environments via version control, facilitating collaboration and verification.

Addressing Non-Deterministic Factors

Non-deterministic factors within the build process, such as timestamps embedded in files or archives, can introduce variability that prevents identical outputs from the same inputs. Addressing these requires targeted modifications to tools and build configurations to normalize or eliminate such elements, ensuring bit-for-bit reproducibility.

Timestamp normalization is a primary strategy, particularly for tools like gzip and tar that embed modification times in their outputs. The SOURCE_DATE_EPOCH environment variable standardizes this by setting a fixed Unix timestamp, typically the time of the last source change, across tools, overriding dynamic values derived from the build time or file metadata. For instance, gzip can be patched or configured to use this variable instead of the current time, while tar supports options like --mtime to enforce consistent timestamps in archives. Similarly, UID and GID normalization in archives prevents user-specific metadata from affecting outputs; tools like tar can be invoked with --numeric-owner or post-processed to set fixed values like 0 for root, ensuring consistency regardless of the build user.

Randomness control targets pseudo-random number generators (PRNGs) that may produce varying sequences due to unseeded or entropy-based initialization. The recommended approach is to seed PRNGs with a fixed, deterministic value, such as a constant or a value derived from SOURCE_DATE_EPOCH, or to disable randomness where possible. Such seeding ensures that any random-like inputs required during compilation or linking yield identical results.

Locale and path independence mitigate variations from user-specific settings or build directories. Some Autotools-based projects provide configure flags such as --enable-reproducible-builds to strip locale-dependent information from outputs and rewrite absolute paths to relative ones in debug symbols or binaries. This prevents differences arising from environment variables like LC_ALL, or from the absolute build path, which compilers might embed in object files. For path handling, compilers like GCC provide options such as -fdebug-prefix-map to canonicalize source paths, ensuring that the same relative structure is used irrespective of the build directory.

For compression and archiving tools, deterministic modes eliminate order- or metadata-induced variability. The xz utility supports options like --check=crc32 and fixed block sizes to produce identical outputs from the same input, while zip requires flags such as -X (omit extra fields) and directory sorting to avoid non-deterministic file ordering or timestamps. Post-build tools like strip-nondeterminism automate normalization for common archive formats such as gzip, zip, and jar by reordering entries and fixing metadata, and are commonly used in Debian to achieve reproducibility.
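Several of these normalizations (sorted entry order, clamped timestamps, fixed ownership, no gzip header timestamp) can be combined in one place. The following Python sketch, using only the standard library, builds a .tar.gz whose bytes depend solely on file contents and relative paths; it is an illustration of the technique, not a drop-in replacement for tar or strip-nondeterminism:

```python
import gzip
import io
import os
import tarfile

def deterministic_tgz(src_dir: str, out_path: str) -> None:
    """Archive src_dir so that repeated runs produce identical bytes:
    entries are added in sorted order, mtimes are clamped to
    SOURCE_DATE_EPOCH, uid/gid are normalized to root, and the gzip
    header timestamp is zeroed."""
    epoch = int(os.environ.get("SOURCE_DATE_EPOCH", "0"))
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        for root, dirs, files in os.walk(src_dir):
            dirs.sort()                   # deterministic traversal order
            for name in sorted(files):    # deterministic entry order
                full = os.path.join(root, name)
                info = tar.gettarinfo(full, os.path.relpath(full, src_dir))
                info.mtime = min(info.mtime, epoch)   # clamp timestamps
                info.uid = info.gid = 0               # normalize ownership
                info.uname = info.gname = "root"
                with open(full, "rb") as f:
                    tar.addfile(info, f)
    with open(out_path, "wb") as out:
        # mtime=0 keeps the gzip header free of the current time.
        out.write(gzip.compress(buf.getvalue(), mtime=0))
```

Running this twice over directories with identical contents, even if the files were created at different times, yields byte-identical archives.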

History and Development

Origins and Early Concepts

The concept of reproducible builds emerged from early discussions within the free software community during the 1990s, particularly around achieving deterministic compilation processes. Developers recognized that variations in build environments could lead to differing binary outputs from the same source code, prompting efforts to standardize compilation for consistency. In the GNU project, this idea was put into practice with tools developed in the early 1990s to support reproducible outputs across architectures. The philosophical foundations of these technical pursuits were rooted in the open-source movement's emphasis on transparency and verifiability. Richard Stallman's 1985 GNU Manifesto advocated for the free distribution of software to enable users to study, modify, and redistribute it, providing a framework that prioritized user empowerment through accessible and inspectable software artifacts. Formal academic attention to software integrity and verification began to coalesce in the early 2000s, building on prior principles. Preceding organized projects, late-1990s academic research on software integrity focused on cryptographic methods to prove the unaltered provenance of binaries. Such approaches underscored the potential for reproducibility in maintaining software trustworthiness.

Key Milestones and Projects

The Reproducible Builds project emerged in the early 2010s as a collaborative initiative driven by free-software activists, with its formal launch in 2013 focused on auditing and enhancing the reproducibility of Debian packages. This effort began with initial patches to Debian's dpkg tool in August 2013, enabling the first reproducible build of the hello package and setting the stage for broader adoption across ecosystems. Early milestones included the development of Gitian in 2011 for Bitcoin to enable verifiable builds, and in 2013, the Tor Browser achieved reproducibility, influencing Debian's efforts at DebConf13.

A major milestone came with Debian 12 (Bookworm), where the essential and required package sets achieved 100% reproducibility on the amd64 and arm64 architectures in August 2022, marking a significant verification advancement for the distribution's core components. Efforts have continued, with Debian targeting full reproducibility in its upcoming Debian 14 release. Other Linux distributions have progressively adopted reproducible builds. Arch Linux initiated partial adoption in 2015 through community-driven efforts, reaching approximately 80-90% reproducibility for its packages by the mid-2020s via tools like embedded .BUILDINFO files in packages. Fedora established a formal goal of 99% reproducible package builds, integrating it as an expectation for maintainers and leveraging infrastructure changes to support independent verification.

Beyond distributions, cross-project integrations have amplified the impact. The Tor Project incorporated reproducible builds into its browser bundle starting in 2013, enabling users to independently verify binaries against the published source code to counter potential supply-chain compromises. Similarly, Bitcoin Core has employed reproducible builds since the mid-2010s to facilitate wallet verification, allowing anyone to compile identical binaries from the MIT-licensed source code for enhanced confidence in the software's integrity. Recent developments include the 2024 Reproducible Builds Summit, which fostered collaboration on supply-chain security through workshops and tool development.

Challenges and Solutions

Technical Challenges

One major technical challenge in achieving reproducible builds stems from compiler non-determinism, particularly influenced by operating system features like address space layout randomization (ASLR). ASLR randomizes the base addresses of executable segments in memory to enhance security, but this can lead to variability in compiler outputs, such as differing debug information or uninitialized memory reads during compilation in compilers like GCC and Clang. For instance, in some environments, ASLR may cause inconsistent binary outputs across builds even with identical source code and flags, as the randomized layout affects how the compiler accesses or embeds memory-dependent data.

Dependency fetching introduces further variability, especially in ecosystems like Java's Maven, where repositories and mirrors can serve inconsistent content over time. Mirrors may return different versions of the same artifact due to caching, updates, or resolution order, leading to non-reproducible binaries; a documented case involved the commons-collections library fetching version 3.2.1 (vulnerable to a CVE) in one build environment and 3.2.2 (patched) in another, solely due to mirror differences without changes to the pom.xml. This issue is exacerbated by dynamic downloads during builds, where network-dependent fetches from Maven Central or proxies introduce timestamps or metadata variations not controlled by the build script.

Multi-platform portability poses significant hurdles due to architectural differences, such as endianness and floating-point behavior variations between systems like x86 and ARM. Endianness affects byte ordering in multi-byte data types, including floating-point representations, requiring explicit handling in bit-wise operations to avoid discrepancies in serialized or embedded data across little-endian (e.g., x86) and big-endian (e.g., some PowerPC configurations) architectures. Floating-point arithmetic, governed by the IEEE 754 standard, exhibits non-associativity in operations like addition, where (a + b) + c ≠ a + (b + c) due to rounding errors, amplifying differences in sums or transcendental functions across platforms, as math libraries may implement varying rounding semantics without strict standardization. While well-defined IEEE operations yield identical results on x86 and ARM, undefined behaviors (e.g., overflow handling) can diverge, complicating bit-for-bit reproducibility in portable applications.

Emerging challenges since 2023 involve integrating AI-generated code and machine learning (ML) model builds, where non-determinism from training data and processes undermines reproducibility. In ML pipelines, training involves random weight initialization, data subsampling, and stochastic optimizers like SGD, leading to varying model weights and outputs even with fixed seeds, as hardware differences (e.g., GPU floating-point behavior) propagate errors through billions of operations. For AI-generated code, large language models (LLMs) introduce variability in outputs due to inherent model non-determinism, making it difficult to pin down exact code artifacts for subsequent builds, particularly in automated pipelines where generated code feeds into compilation. These issues highlight a growing gap in traditional reproducible build practices, as training data and environmental stochasticity resist deterministic capture.
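The non-associativity of floating-point addition is easy to demonstrate. The following minimal Python snippet shows that merely regrouping a sum changes the result, which is why compilers or libraries that reorder reductions can break bit-for-bit reproducibility:

```python
# IEEE 754 double-precision addition rounds after every operation,
# so the grouping determines which rounding errors accumulate.
left = (0.1 + 0.2) + 0.3   # rounds 0.1 + 0.2 first
right = 0.1 + (0.2 + 0.3)  # rounds 0.2 + 0.3 first

print(left)           # 0.6000000000000001
print(right)          # 0.6
print(left == right)  # False
```

The two results differ only in the last bit of the mantissa, but for reproducible builds even a one-bit difference means the artifacts no longer match.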

Organizational and Practical Hurdles

Implementing reproducible builds imposes significant resource demands, particularly in terms of computational infrastructure and time required for verification. Projects like Debian rely on extensive continuous integration (CI) setups, such as the reprotest tool, which performs multiple builds in varied environments (e.g., via SSH, schroot, or QEMU) to detect non-determinism, necessitating substantial hardware and bandwidth for ongoing testing across thousands of packages. This resource intensity often strains smaller teams or open-source initiatives, where funding for dedicated CI grids is limited, leading to slower progress despite motivated contributors.

Adoption faces skill and coordination challenges, especially in large-scale or proprietary software environments, where cross-team collaboration is essential but frequently lacking. In proprietary settings, developers must align with upstream vendors and internal stakeholders to patch non-deterministic elements, but poor communication and differing priorities hinder upstream acceptance of fixes, as noted by 11 out of 17 surveyed experts. Businesses in primary and secondary software sectors report technical coordination as a key barrier, with only selective implementation due to the need for specialized expertise in build systems and toolchains.

Retrofitting legacy codebases for reproducibility presents additional practical hurdles, requiring extensive audits and modifications that disrupt existing workflows. For instance, in the Android ecosystem, despite ongoing efforts by communities like F-Droid to enable verifiable builds for open-source apps, broader adoption remains slow due to the complexity of integrating deterministic practices into mature, multi-vendor codebases spanning millions of lines. This issue is compounded in proprietary Android derivatives, where closed-source components resist full verification without vendor cooperation.
The absence of standardized metrics further complicates measurement and progress tracking, making it difficult to benchmark adoption across projects. While major Linux distributions have achieved varying coverage—Debian at approximately 95% for its unstable branch, Fedora at around 90%, and Arch Linux at 86% as of late 2025—comparisons are inconsistent due to differing definitions of "reproducible" and testing scopes. These figures highlight incremental gains but underscore the need for unified evaluation frameworks to quantify impact and drive wider implementation.

References

  1. [1]
    Definitions — reproducible-builds.org
    ### Summary of Reproducible Builds Definition
  2. [2]
    Reproducible Builds — a set of software development practices that ...
    Reproducible builds are a set of software development practices that create an independently-verifiable path from source to binary code. (Find out more).Definitions · Tools · News · Docs
  3. [3]
    The history, status, and plans for reproducible builds - LWN.net
    Aug 23, 2024 · Levsen showed a graph of the full history of the continuous-integration (CI) system for reproducible builds of Debian unstable packages. It ...
  4. [4]
    History — reproducible-builds.org
    The idea of reproducible builds is not very new. It was implemented for GNU tools in the early 1990s (which we learned, much later in 2017).
  5. [5]
    Docs — reproducible-builds.org
    ### Summary of Key Terminology and Concepts from https://reproducible-builds.org/docs/
  6. [6]
    ReproducibleBuilds - Debian Wiki
    Oct 26, 2025 · Reproducible builds of Debian as a whole is still not a reality, though individual reproducible builds of packages are possible and being ...
  7. [7]
    System images — reproducible-builds.org
    This documentation's intent is to share what we currently know about making system images build reproducibly: for example, VM and cloud images, live systems, ...<|control11|><|separator|>
  8. [8]
    Why reproducible builds?
    Having reproducible builds means that only changes in source code or build environment (such as the compiler version) will lead to differences in the generated ...Resisting Attacks · Quality Assurance · Dependency Tree Awareness...
  9. [9]
    Commandments of reproducible builds
    Commandments of reproducible builds. Thou shall not record the name of thy maker nor the place of thy making (username, hostname); Thou shall not record the ...
  10. [10]
  11. [11]
    Merkle trees and build systems - LWN.net
    May 28, 2020 · Using Merkle trees as first-class citizens in a build system gives great flexibility and many optimization opportunities.
  12. [12]
    Tools — reproducible-builds.org
    repro is intended to be a tool for users to verify packages distributed by Arch Linux. It uses the embedded .BUILDINFO file to reconstruct an identical build ...
  13. [13]
    Frequently Asked Questions - SLSA
    “Reproducible” means that repeating the build with the same inputs results in bit-for-bit identical output. This property provides many benefits, including ...
  14. [14]
    Researchers Solve Juniper Backdoor Mystery; Signs Point to NSA
    Dec 22, 2015 · Security researchers believe they have finally solved the mystery around how a sophisticated backdoor embedded in Juniper firewalls works.
  15. [15]
    How NixOS and reproducible builds could have detected the xz ...
    How NixOS and reproducible builds could have detected the xz backdoor for the benefit of all ... Let's now try our detection method on the ...
  16. [16]
    Bisecting Rust Compiler Regressions with cargo-bisect-rustc | Inside ...
    Dec 18, 2019 · cargo-bisect-rustc automatically downloads rustc artifacts and tests them against a project you provide until it finds the regression. At ...
  17. [17]
    FIPS 140-2 Explained: The engineer's guide to compliance
    Aug 21, 2025 · Reproducible builds, image hardening, and kernel-independent images keep integrity checks valid across environments, avoiding costly ...
  18. [18]
    EU Cyber Resilience Act Compliance - Sherlocked Security
    May 8, 2025 · Ensure reproducibility of software builds and security configurations. Document Class I/II conformance statements for Notified Bodies. 7 ...
  19. [19]
    Reproducible builds - Docker Docs
    How to create reproducible builds in GitHub Actions using the SOURCE_EPOCH environment variable.
  20. [20]
    Fedora Reproducible Builds
    Fedora reproducible builds aim to allow users to independently verify that packages haven't been tampered with, and that the official build is trustworthy.
  21. [21]
    Developer best practices: Reproducible builds - Internet Computer
    Apr 10, 2025 · It starts from an official Docker image, such that all the installed tools are standard and come from standard sources. This provides the user ...
  22. [22]
    package-lock.json - npm Docs
    Oct 3, 2025 · package-lock.json is automatically generated for any operations where npm modifies either the node_modules tree, or package.json . It describes ...
  23. [23]
    Base images - Docker Docs
    Create a minimal base image using scratch. The reserved, minimal scratch image serves as a starting point for building containers. Using the scratch image ...Missing: slate | Show results with:slate
  24. [24]
    ReproducibleBuilds/Howto - Debian Wiki
    The set of binary packages involved in the build. This includes Essential packages, build-essential, and Build-Depends and Build-Depends-Indep with the ...Older method, deprecated · Do It Yourself · Files in data.tar contain...
  25. [25]
    Timestamps — reproducible-builds.org
    The idea is to either remove the timestamps entirely or to normalize them to a predetermined date and time. strip-nondeterminism was designed as an ...Missing: gzip tar
  26. [26]
    SOURCE_DATE_EPOCH — reproducible-builds.org
    SOURCE_DATE_EPOCH is an environment variable specifying the last modification of source code, measured in seconds since the Unix epoch, for reproducible builds.
  27. [27]
    ReproducibleBuilds/TimestampsInGzipHeaders - Debian Wiki
    Jan 9, 2015 · gzip stores a timestamp by default in its header. This prevents builds from being reproducible and captures an uninteresting detail.
  28. [28]
    Archive metadata — reproducible-builds.org
    In practice, the tar output is typically stable, but the internal gzip implementation is not. Using git archive --format=tar TAG | gzip -6 -n is ...
  29. [29]
    Randomness — reproducible-builds.org
    If random-like input is required, the solution is to use a predetermined value to seed a pseudo-random number generator. This value can be read from some file, ...
  30. [30]
    Enable reproducible builds with SOURCE_DATE_EPOCH
    Enable reproducible builds when the SOURCE_DATE_EPOCH environment variable is defined. See the following page for details on defining the variable: ...
  31. [31]
    Build path — reproducible-builds.org
    Some tools will record the path of the source files in their output. Most compilers write the path of the source in the debug information in order to locate ...
  32. [32]
    The GNU Manifesto - GNU Project - Free Software Foundation
    The GNU Manifesto (which appears below) was written by Richard Stallman in 1985 to ask for support in developing the GNU operating system.
  33. [33]
    Success stories — reproducible-builds.org
    In Debian, the essential and required package sets became 100% reproducible in Debian bookworm on the amd64 and arm64 architectures. ... Debian 11 / bullseye. The ...
  34. [34]
    Changes/Package builds are expected to be reproducible
    Sep 4, 2025 · After this Change, package builds are expected to be reproducible. Bugs will be filed against packages when an irreproducibility is detected.
  35. [35]
    Deterministic Builds Part One: Cyberwar and Global Compromise
    Aug 20, 2013 · This blog post attempts to answer the first question: "Why would anyone want a deterministic build process?" The short answer is: to protect against targeted ...
  36. [36]
    Bitcoin Core :: Download
    Performing the verification steps here ensures that you have not downloaded an unexpected or tampered version of Bitcoin, which may result in loss of funds.
  37. [37]
    Taking the next step: OSS-Fuzz in 2023 - Google Online Security Blog
    ... an automated code testing ...
  38. [38]
    Hamburg 2024 — reproducible-builds.org
    Hamburg 2024. When: September 17-19 2024. What: Three days to continue the growth of the Reproducible Builds effort. As a follow up to the seven previous ...
  39. [39]
    mitigating non-determinism - reproducible-builds
    Jun 18, 2024 · ... ASLR: Influences from address-space-layout randomization (ASLR) can be avoided with setarch -R COMMAND or globally with echo 0 > /proc/sys ...
  40. [40]
    ASLR introduces non-determinism · Issue #359 - GitHub
    Jan 21, 2025 · While working on reproducible builds for openSUSE, I found that our intel-graphics-compiler v2.5.6 varies between builds.
  41. [41]
    The Surprising Pitfalls of Maven Environments and Reproducibility
    May 19, 2025 · This post dives into the surprising ways Maven build environments can introduce non-determinism, using a real-world issue involving commons-collections.
  42. [42]
    JVM — reproducible-builds.org
    Reproducible Central is an effort to rebuild public releases published to Maven Central and check that Reproducible Build can be achieved.
  43. [43]
    [PDF] Designing Bit-Reproducible Portable High-Performance Applications
    If the vendor of the library does not specify its exact floating- point semantics, no assumptions can be made about the reproducibility of the computation.
  44. [44]
    Understand floating-point behavior across x86 and Arm architectures
    This is a topic for developers who are porting applications from x86 to Arm and want to understand floating-point behavior across these architectures. Both ...
  45. [45]
    Challenges of reproducible AI in biomedical data science
    Jan 10, 2025 · In this study, we examine the challenges of AI reproducibility by analyzing the factors influenced by data, model, and learning complexities.
  46. [46]
    Solving Reproducibility Challenges in Deep Learning and LLMs
    Sep 22, 2024 · Reproducibility in deep learning is challenging due to floating-point arithmetic and hardware differences.
  47. [47]
    [PDF] On the Importance and Challenges of Reproducible Builds for ...
    This section provides background information for reproducible builds and their relation to overall open source software supply chain security, as well as ...
  48. [48]
    On business adoption and use of reproducible builds for open and ...
    Nov 29, 2022 · This study explores the utility of applying R-Bs in businesses in the primary and secondary software sectors and the business and technical reasons supporting ...
  49. [49]
    Reproducible Builds - Free and Open Source Android App Repository
    To find out if an app can be reproducibly rebuilt by our own buildserver, check the “Reproducibility Status” on any app's page on this website. This can help us ...
  50. [50]
    Reproducible Builds: Increasing the Integrity of Software Supply ...
    Apr 13, 2021 · In this paper, we present reproducible builds, an approach that can determine whether generated binaries correspond with their original source code.
  51. [51]
    Overview of various statistics about reproducible builds
    Summary of reproducible packages in Debian (2025).
  52. [52]
    Arch Linux Reproducible Status
    Arch Linux is 86.2% reproducible with 2081 bad 1 unknown and 12996 good packages. [core] repository is 94.6% reproducible with 15 bad 0 unknown and 261 good ...