
Digital artifact

A digital artifact is any undesired or unintended alteration in data introduced during a digital process by the techniques or technologies involved. The term is used across various fields, including digital media processing (where artifacts often manifest as visible or audible distortions in images, videos, or audio), digital preservation (affecting long-term accessibility), and digital forensics (serving as traces of system or user activity).

In digital media, these distortions are commonly caused by limitations or errors in processing techniques, such as lossy compression algorithms that discard data to reduce file size. They appear as unintended alterations that degrade quality, becoming more pronounced at higher compression ratios or in complex scenes with fast motion, and represent a trade-off between efficient storage and transmission on the one hand and faithful reproduction on the other. Digital artifacts in media arise from stages such as analog-to-digital conversion, sampling, encoding, and decoding. For example, quantization errors arise when continuous signals are approximated by discrete values, leading to inaccuracies, while lossy formats like JPEG and MPEG remove perceptual redundancies, such as high-frequency details. Other causes include transmission errors (causing dropped or corrupted data) and hardware limitations in rendering.

Common media artifacts include blocking (grid-like discontinuities), ringing (oscillations around edges), mosquito noise (distortions near high-contrast areas), and blurring (smoothed details) in images and videos, as well as clipping or quantization noise in audio. These are evident in streaming, digital photography, and broadcasting. Mitigation involves advanced algorithms such as perceptual coding and post-processing filters. Despite progress in codecs such as AVC and HEVC, artifacts remain a key challenge in digital media standards for capture, compression, and distribution.

Overview and Definitions

Core Definition

A digital artifact encompasses any digital object or unintended alteration in data that emerges from processes such as creation, processing, compression, or transmission. The term applies across various domains in computing and information science, where artifacts can represent either flaws introduced by technical limitations or meaningful records of activity. Unlike physical artifacts, which exist tangibly and independently, digital artifacts are inherently encoded representations that rely on specific hardware, software, and formats for rendering and interpretation, making their accessibility contingent on evolving technological contexts.

The concept manifests in distinct senses depending on the context. In media processing, digital artifacts typically denote undesired distortions in signals or content, such as noise or imperfections arising from compression algorithms in images, audio, or video files. These alterations degrade perceptual quality but are intrinsic to efficient data handling in digital systems. In digital preservation, digital artifacts refer to born-digital objects—materials originating in digital form, like electronic documents or datasets—valued for their cultural or informational significance and requiring strategies to mitigate format dependency and obsolescence. These differ from digitized physical items, as born-digital artifacts have no analog precursor and demand ongoing technological intervention to remain interpretable. In digital forensics, digital artifacts are traces of user or system activity, such as cache files or log entries, that serve as evidence for reconstructing events on devices. A formal conceptualization, known as the Curated Forensic Artifact (CuFA), defines them based on properties including curation via forensic procedures, evidentiary value, antecedent temporal relations, and location in a useful format, in order to standardize analysis and reporting. This sense highlights their evidentiary role, distinct from the degradative connotations in media processing.

Overall, the dependency on rendering environments underscores a core property: digital artifacts' meaning and accessibility are mediated by interpretive tools, distinguishing them from self-evident physical counterparts.

Historical Development

The concept of digital artifacts emerged in the 1970s and 1980s amid advancements in digital signal processing and image compression, where unintended distortions in processed data first gained attention as byproducts of early digitization techniques. These artifacts were particularly evident in nascent efforts to reduce data volume for storage and transmission, laying the groundwork for recognizing digital imperfections as inherent to computational processing. By the late 1980s, research into compression algorithms, such as those developed by the Joint Photographic Experts Group (JPEG), highlighted visible distortions like blocking and ringing as key challenges in lossy encoding. The JPEG standard, formalized by ISO in 1992, marked the first widespread documentation of such artifacts, with patents filed as early as 1988 describing their occurrence in discrete cosine transform-based methods.

In the 1990s, the term expanded beyond technical imaging to encompass broader applications in digital libraries and forensics, driven by the proliferation of digitized content. Projects like JSTOR, launched in 1995, exemplified early digitization initiatives that preserved scholarly journals while introducing artifacts from scanning and conversion processes, emphasizing the need for reliable archiving. Concurrently, law enforcement agencies, including the FBI, began leveraging digital traces in investigations, with formal programs established by 1984 evolving into comprehensive forensic analysis by the mid-1990s to address computer-related crimes. This period saw artifacts not only as errors but as evidentiary markers, such as residual data patterns in seized media.

The 2000s witnessed accelerated growth in web archiving and media standards, formalizing digital artifacts within preservation frameworks. The Internet Archive, founded in 1996, began systematically capturing web content, revealing artifacts from crawling inconsistencies and format obsolescence that underscored the fragility of online ephemera.
In 2002, the Open Archival Information System (OAIS) reference model was published, later codified as ISO 14721:2003, providing a foundational standard for managing digital objects and mitigating artifactual degradation over time. Video compression standards like MPEG-2, widely adopted in the early 2000s for DVD and broadcasting, further highlighted motion-based artifacts such as macroblocking, influencing media processing norms.

From the late 2000s onward, digital artifacts became integrated into emerging technologies like blockchain and AI-generated content, amid rising cyber threats. Bitcoin's launch in 2009 introduced immutable digital traces as artifacts on distributed ledgers, evolving in the 2010s with non-fungible tokens (NFTs) on platforms like Ethereum (2015 onward) that embedded artistic content and ownership metadata, creating verifiable yet artifact-prone records. Post-2020, AI advancements amplified artifacts in synthetic media, such as deepfakes, prompting heightened focus in data forensics on detecting anomalies amid cyber incidents like ransomware attacks. Key ISO developments, including revisions to OAIS in subsequent editions up to ISO 14721:2025, continued to guide preservation against these evolving challenges.

Artifacts in Digital Media and Processing

Types of Media Artifacts

Digital artifacts in media processing refer to unintended distortions or anomalies that arise during the capture, compression, or transmission of visual and auditory signals, degrading the fidelity of the original content. These artifacts manifest as perceptible irregularities, such as visual patterns or audible distortions, and are distinct from intentional modifications in creative workflows. They commonly occur due to limitations in sampling, quantization, or encoding processes, though detailed causes are explored elsewhere.

The primary types of media artifacts include aliasing, compression artifacts, noise, quantization errors, and moiré patterns. Aliasing arises from undersampling high-frequency components, causing them to appear as lower-frequency replicas, often resulting in jagged edges known as "jaggies" in low-resolution images. Compression artifacts emerge from lossy encoding schemes that discard data to reduce file size, with blocking being a prominent example in JPEG images, where the discrete cosine transform (DCT) divides the image into 8x8 pixel blocks, leading to visible grid-like boundaries at low bitrates. Noise introduces random variations in signal intensity, such as Gaussian noise in scanned or captured images, which follows a normal distribution and stems from sensor or environmental factors, appearing as grainy speckles that obscure fine details. Quantization errors occur during the mapping of continuous analog signals to discrete digital levels, producing banding in smooth gradients where subtle color transitions are replaced by abrupt steps due to insufficient bit depth. Moiré patterns result from interference between repetitive structures in the source material and the sampling grid, creating wavy or checkerboard illusions, as seen in scanned printed images or digital captures of fine textures like fabrics.
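The aliasing described above can be made concrete with a short sketch (illustrative Python; `alias_frequency` is a hypothetical helper, not a standard API): any tone above the Nyquist limit fs/2 yields exactly the same samples as a folded low-frequency tone.

```python
import math

def alias_frequency(f, fs):
    """Apparent (folded) frequency of a tone at f Hz sampled at fs Hz."""
    # Frequencies fold around multiples of the sampling rate; anything
    # above the Nyquist limit fs/2 is indistinguishable from its alias.
    f_mod = f % fs
    return fs - f_mod if f_mod > fs / 2 else f_mod

# A 7 kHz tone sampled at 8 kHz masquerades as a 1 kHz tone:
print(alias_frequency(7000, 8000))  # 1000

# The sample values themselves are identical, which is why no amount of
# post-processing can undo aliasing once the signal has been captured:
fs = 8000
hi = [math.cos(2 * math.pi * 7000 * n / fs) for n in range(8)]
lo = [math.cos(2 * math.pi * 1000 * n / fs) for n in range(8)]
```

Because the two sample sequences are indistinguishable, anti-aliasing must happen before sampling (an analog low-pass filter) or by sampling faster, never afterward.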
In image media, artifacts often present as pixelation from excessive downsampling, where individual pixels become visibly blocky, or as ring-like patterns in medical imaging such as MRI scans, known as Gibbs ringing, which appears as oscillating lines near high-contrast edges due to finite Fourier transform truncation. Video artifacts extend these issues temporally, including blocking similar to that in images but across frames, and motion blur, which smears fast-moving objects when shutter speeds or frame rates fail to capture motion adequately, leading to trailing effects in compressed streams. For audio media, artifacts include clipping, where signal amplitudes exceed the maximum representable value, causing harsh, flattened distortions at peaks, and quantization noise in formats like MP3, which introduces subtle hiss or granularity from coarse amplitude rounding in modified discrete cosine transform (MDCT) encoding. These media-specific manifestations highlight how artifacts adapt to the dimensionality and perceptual nature of each format, from static visuals to dynamic soundscapes.
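Clipping in particular is easy to reproduce: once a sample exceeds the representable range it is clamped, and the flattened peaks are what listeners hear as harsh distortion. A minimal sketch (the helper name `hard_clip` is assumed for illustration):

```python
import math

def hard_clip(samples, limit=1.0):
    """Clamp each sample to the representable range [-limit, limit]."""
    return [max(-limit, min(limit, s)) for s in samples]

# A sine wave boosted 2x overshoots the range; its peaks come back flat.
boosted = [2.0 * math.sin(2 * math.pi * n / 16) for n in range(16)]
clipped = hard_clip(boosted)
# The flat tops add harmonics that were not present in the original
# signal, which is why clipping sounds harsh rather than merely loud.
```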

Causes and Examples

Digital artifacts in media processing often arise from sampling limitations, where violations of the Nyquist-Shannon sampling theorem lead to aliasing distortions. When the sampling rate is less than twice the highest frequency component in the signal, high-frequency details fold back into lower frequencies, producing unwanted patterns such as moiré effects in images or audible heterodyning in audio. This is particularly evident in digital imaging, where insufficient pixel resolution captures fine details inadequately, resulting in jagged edges or false textures.

Lossy compression algorithms introduce another major source of artifacts through aggressive data reduction techniques, such as the discrete cosine transform (DCT) in JPEG encoding, which divides images into 8x8 blocks and discards high-frequency coefficients. This quantization process creates visible blocking artifacts, where boundaries between blocks appear as grid-like patterns, especially at high compression ratios. Sensor imperfections, including noise in cameras, further contribute by introducing random variations during signal capture; thermal and read-out noise in CMOS sensors manifests as grainy or color speckles, degrading low-light performance. Transmission errors, such as packet loss in streaming video over networks, cause frame drops or spatial discontinuities, leading to frozen blocks or temporal jerkiness in the decoded output.

At a mechanistic level, quantization during analog-to-digital conversion represents a fundamental source of error, as continuous analog signals are approximated by discrete levels, introducing rounding discrepancies that appear as banding in images or noise in audio. To mitigate the audibility or visibility of these errors, dithering intentionally adds low-level random noise before quantization, randomizing the quantization error and masking it as benign hiss rather than correlated artifacts.
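The banding-versus-dither trade-off can be demonstrated in a few lines (an illustrative sketch; `quantize` and `quantize_dithered` are hypothetical helper names). A slow ramp quantized without dither collapses into a handful of discrete bands, while triangular (TPDF) dither spreads the same error out as noise:

```python
import random

def quantize(x, step):
    """Round a sample to the nearest quantization level."""
    return round(x / step) * step

def quantize_dithered(x, step, rng):
    """Add triangular (TPDF) dither before quantizing, decorrelating the
    quantization error from the signal so it behaves like broadband noise."""
    dither = (rng.random() - rng.random()) * step  # triangular PDF in +-step
    return round((x + dither) / step) * step

# A smooth 0..1 ramp quantized with a coarse step shows visible banding:
ramp = [i / 1000 for i in range(1000)]
banded = [quantize(x, 0.1) for x in ramp]
print(len(set(banded)))  # 11 -- only eleven flat bands remain

# The dithered version uses the same coarse levels, but neighboring samples
# now jitter between adjacent levels instead of forming hard steps:
rng = random.Random(0)
dithered = [quantize_dithered(x, 0.1, rng) for x in ramp]
```

Averaging many dithered samples recovers intermediate values, which is why dither preserves low-level detail that plain quantization destroys.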
A prominent example involves MP3 compression in music streaming, which proliferated in the late 1990s and early 2000s due to its efficient perceptual coding but introduced artifacts like ringing and pre-echo from psychoacoustic filtering and block-based transforms. Early MP3 files at bitrates below 128 kbps often exhibited noticeable smearing of transients in genres like rock or classical, while later advancements in streaming platforms shifted toward higher bitrates and hybrid codecs, reducing but not eliminating these issues in bandwidth-constrained environments. In smartphone photography, digital zoom relies on interpolation to upscale cropped sensor data, generating artifacts such as softening, haloing, or unnatural sharpening around edges, as seen in devices where AI-enhanced upscaling creates fabricated details. In 3D graphics rendering, z-fighting occurs when polygons with near-identical depth values compete for coverage, producing flickering or shimmering surfaces, a common issue in scenes with overlapping geometry like architectural visualizations. These artifacts collectively degrade perceptual quality by introducing visible or audible distortions that disrupt immersion and fidelity, often quantified using metrics like peak signal-to-noise ratio (PSNR), which measures the ratio of maximum signal power to noise-induced error but correlates imperfectly with human perception in complex scenes.
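PSNR itself is straightforward to compute; the sketch below (plain Python, no external libraries) treats images as flat pixel sequences:

```python
import math

def psnr(original, distorted, max_val=255):
    """Peak signal-to-noise ratio in decibels between two pixel sequences."""
    mse = sum((a - b) ** 2 for a, b in zip(original, distorted)) / len(original)
    if mse == 0:
        return float("inf")  # identical signals: no noise at all
    return 10 * math.log10(max_val ** 2 / mse)

# A uniform error of 16 grey levels on 8-bit pixels gives roughly 24 dB,
# a heavily degraded image; above ~40 dB differences are hard to see.
print(round(psnr([128] * 64, [144] * 64), 1))  # 24.0
```

Because PSNR weighs every pixel error equally, two images with identical PSNR can look very different, which is one reason perceptual metrics are often preferred for codec tuning.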

Digital Artifacts in Preservation

Role in Digital Preservation

In digital preservation, digital artifacts are defined as born-digital materials—such as emails, PDFs, and 3D models—or items digitized from analog sources that possess both informational content and artifactual value, making them worthy of long-term safeguarding. These artifacts encompass a wide range of cultural and historical records, where their value extends beyond mere data to include contextual elements like creation metadata and structural integrity.

The role of digital artifacts in preservation is crucial for maintaining access to cultural heritage, as they represent irreplaceable records of human activity that physical artifacts cannot fully replicate; however, unlike tangible objects, digital ones face unique threats from format obsolescence and technological decay, potentially rendering them inaccessible without intervention. This preservation effort ensures that societal memory—encompassing everything from personal correspondence to institutional archives—remains viable for future generations, mitigating the risk of total loss due to hardware failures or software incompatibilities.

A key concept in this domain is digital artifactual value, which refers to the intrinsic properties of these objects, such as file formats, embedded metadata, and rendering behaviors, rather than just their extracted content; for instance, the layout and hyperlinks in a preserved web page contribute to its evidential significance. Examples include scanned manuscripts that retain original annotations, or web pages captured by the Internet Archive's Wayback Machine, launched in 2001 to archive online content and preserve its interactive elements. Preservation processes begin with selection, involving the identification of significant properties—technical characteristics deemed essential for authenticity—as outlined in the PREMIS data dictionary developed in 2005. To sustain access, strategies like migration (converting files to updated formats) or emulation (replicating original software environments) are employed, balancing fidelity to artifactual value with practical constraints.
Illustrative cases include the Library of Congress's digital collections, where 19th-century photographs have been digitized to preserve visual details and historical context, with ongoing efforts since the 1990s to migrate formats and emulate viewing conditions for sustained accessibility. These initiatives highlight how digital artifacts serve as foundational elements in archival strategies, enabling researchers to engage with heritage materials in their intended forms.

Strategies for Long-Term Preservation

Long-term preservation of digital artifacts requires a combination of technical strategies to ensure their authenticity, integrity, and accessibility over extended periods. One core approach is metadata embedding, which involves incorporating descriptive and administrative information directly into digital files to maintain context and provenance. For instance, the Exif standard embeds technical metadata such as capture date and camera settings into image files, aiding in their identification and verification during preservation efforts. Similarly, the Dublin Core Metadata Element Set provides a simple framework for describing documents with elements like title, creator, and format, facilitating long-term management across diverse repositories.

Another essential strategy is file format migration, which updates artifacts to more stable, open formats to mitigate obsolescence. This process often involves converting legacy files, such as migrating documents to PDF/A, an ISO-standardized subset of PDF designed specifically for archival purposes by restricting features that could compromise long-term readability, such as embedded scripts or external dependencies. Emulation serves as a complementary method, recreating the original software and hardware environment on modern systems to render artifacts without altering their content, thereby preserving the authentic behavior of interactive or software-dependent materials. Additionally, trusted digital repositories employ distributed systems like LOCKSS (Lots of Copies Keep Stuff Safe), a network initiated in 2002 that creates multiple redundant copies across institutions to protect against data loss from hardware failure or institutional threats. Key standards underpin these strategies to ensure interoperability and reliability.
The Open Archival Information System (OAIS) reference model, formalized as ISO 14721 in 2003 and updated in 2012 (with a further revision in 2025), provides a conceptual framework for archival systems, defining functional entities like ingest, archival storage, and access to support long-term preservation workflows. Complementing OAIS, the PREMIS (Preservation Metadata: Implementation Strategies) standard, released in 2005 and now maintained by the Library of Congress, specifies a data dictionary for capturing the technical, provenance, rights, and fixity metadata essential for monitoring and maintaining digital artifacts.

Practical tools and organizational practices further enhance preservation efficacy. Checksums, such as SHA-256 hashes, generate unique digital fingerprints to verify file integrity by detecting alterations from bit rot or unauthorized changes, forming a foundational layer of fixity checks in repository workflows. Risk assessment is supported by frameworks like the Trustworthy Repositories Audit and Certification (TRAC) criteria, developed in 2007 by the Center for Research Libraries and others, which evaluate repositories against 84 measurable attributes across organizational, digital object, and technological dimensions to certify their sustainability.

Case studies illustrate the application of these strategies in real-world scenarios. The European ARROW (Accessible Registries of Rights Information and Orphan Works towards Europeana) project, running from 2008 to 2011, integrated metadata embedding and repository standards to manage rights information for millions of digitized cultural artifacts, enabling safer mass preservation by identifying orphan works and streamlining clearance processes. However, challenges persist with proprietary formats, such as early Microsoft Word documents (.doc), where undocumented features and vendor dependency complicate migration and emulation, often requiring custom tools or format normalization to avoid loss of embedded macros or layout fidelity.

Digital Artifacts in Forensics

Definition and Importance

In digital forensics, artifacts are defined as residual traces of activity left on a device as a result of user interactions, system events, or attempted deletions, such as temporary files, cache entries, or log records that persist even after the primary data is removed or obscured. These artifacts represent unintentional byproducts of normal device usage, providing investigators with indirect evidence of activities that may not be evident from surface-level content. Unlike primary content—such as documents, images, or emails that constitute the substantive data itself—forensic artifacts primarily consist of metadata, system logs, or ancillary records that indicate what happened rather than what the data says. For instance, while content might show a file's substance, artifacts like timestamps in file allocation tables or registry entries reveal the sequence, timing, and intent behind its creation or access, offering a more complete reconstruction of events.

The importance of these artifacts lies in their ability to reconstruct event timelines and provide non-repudiable evidence in investigations, particularly of cybercrimes, where they surpass content alone by disclosing patterns of behavior, such as repeated unauthorized access or deletion attempts. In cases involving computers, mobile devices, or networks, artifacts are crucial for establishing evidentiary integrity, adhering to chain-of-custody protocols outlined in NIST guidelines from the 2000s, which emphasize integrity verification to prevent tampering and ensure admissibility in court. The formal recognition of digital artifacts in forensics emerged in the 1990s, coinciding with the development of specialized tools like EnCase, launched in 1998, which enabled systematic acquisition and analysis of such traces from seized storage media.

Key Forensic Artifacts and Analysis

In digital forensics, key artifacts are categorized into system, application, network, and mobile types, each providing distinct evidence of user activity, system events, and device interactions. System artifacts, such as those in the Windows Registry, record core operating system behaviors like program executions and device connections. For instance, the UserAssist registry key under NTUSER.DAT tracks user interactions with applications, including execution counts and last-run timestamps, enabling investigators to reconstruct software usage patterns. Similarly, the USBSTOR subkey in the SYSTEM hive enumerates connected USB devices by device identifier and vendor details, aiding in tracing external media usage. Prefetch files (.pf) in the C:\Windows\Prefetch directory cache execution data for frequently used executables, revealing run counts, timestamps, and loaded DLLs to assess execution frequency and timing.

Application artifacts capture user interactions within software, often stored in structured databases. Browser history, for example, is maintained in SQLite databases like Chrome's History file, which logs URLs, visit timestamps, and download paths, allowing recovery of web activity even from deleted entries through unallocated space analysis. These artifacts help correlate online behavior with other evidence, such as timestamps aligning with system logs. Network artifacts involve captured traffic data, primarily in PCAP (Packet Capture) files, which record full packet details including source/destination IPs, protocols, and payloads for reconstructing communications. Analysis of PCAPs can reveal command-and-control channels or data exfiltration, with tools dissecting encrypted sessions via metadata like TLS handshakes. Mobile artifacts, particularly on iOS, include plist (property list) files that store configuration and usage data. Significant-location data in the com.apple.routined cache directory, for instance, retains geolocation history from apps and services, including latitude/longitude coordinates and timestamps, useful for mapping user movements.
The Amcache.hve file, introduced with Windows 8 in 2012 and located at C:\Windows\AppCompat\Programs\Amcache.hve, further exemplifies system artifacts by logging application installations, executions, and file paths, providing evidence of software deployment even after uninstallation.

Analysis techniques focus on extracting and correlating these artifacts for investigative insights. Timeline reconstruction integrates timestamps from multiple sources into a chronological narrative, often using the open-source Plaso tool, which parses logs, registries, and files to generate super timelines for event sequencing. Hashing with SHA-256 ensures evidence integrity by generating unique digests of files or images; any alteration produces a mismatched hash, verifying chain-of-custody. File carving recovers deleted or fragmented data by scanning unallocated disk space for known file signatures (e.g., JPEG headers), bypassing file system metadata to retrieve artifacts like images or documents.

Specialized tools facilitate these analyses. Autopsy, an open-source platform based on The Sleuth Kit, automates ingestion of disk images, timeline generation, and carving across artifact categories, supporting keyword searches and artifact visualization. Volatility, a framework for memory forensics, extracts volatile artifacts like running processes and network connections from RAM dumps, crucial for detecting in-memory malware. Challenges in artifact analysis include anti-forensic techniques like data wiping, where tools overwrite files with patterns (e.g., zeros or random data) to hinder recovery, necessitating advanced carving and residual analysis to detect remnants.
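Signature-based carving can be illustrated with a toy scan for JPEG markers (a deliberate simplification: real carvers, such as the modules shipped with forensic suites, also validate internal structure and handle fragmentation):

```python
# JPEG streams begin with the SOI marker FF D8 FF and end with EOI FF D9.
JPEG_SOI = b"\xff\xd8\xff"
JPEG_EOI = b"\xff\xd9"

def carve_jpegs(raw):
    """Scan raw bytes (e.g. an unallocated-space dump) for JPEG start/end
    markers and return candidate files, ignoring file-system metadata."""
    found, pos = [], 0
    while (start := raw.find(JPEG_SOI, pos)) != -1:
        end = raw.find(JPEG_EOI, start + len(JPEG_SOI))
        if end == -1:
            break  # truncated or fragmented file; a real carver would retry
        found.append(raw[start:end + len(JPEG_EOI)])
        pos = end + len(JPEG_EOI)
    return found

# Two "deleted" images embedded in junk bytes are still recoverable:
disk = (b"\x00" * 16 + JPEG_SOI + b"imagedata" + JPEG_EOI
        + b"\x00" * 8 + JPEG_SOI + b"more" + JPEG_EOI)
print(len(carve_jpegs(disk)))  # 2
```

Because the scan keys only on byte signatures, it works even when the file system's own records of the files have been deleted, which is exactly the property that makes carving useful against wiping of directory metadata.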

Challenges and Future Directions

Common Challenges

Handling digital artifacts presents several technical challenges that threaten their integrity and usability over time. Bit rot, the gradual corruption of data due to hardware failures or environmental factors, can silently alter files without detection, complicating long-term preservation efforts in archives and other collections. Format obsolescence further exacerbates this issue, as older file types become unreadable when supporting software or hardware is discontinued; for instance, files created in proprietary word processors during the 1980s are often inaccessible today without specialized conversion tools. Scale poses another hurdle, particularly for repositories managing vast collections, where processing and verifying millions of files strains computational resources and requires robust architectures to prevent widespread data loss.

Ethical and legal concerns add complexity to artifact management across domains. In forensics, privacy protections under regulations like the EU's General Data Protection Regulation (GDPR), effective since May 2018, demand careful handling of personal data extracted from devices to avoid unauthorized disclosure during investigations. Similarly, ownership disputes arise in the preservation of cultural artifacts, where digitization of indigenous or communal heritage raises questions of rights and consent, potentially leading to cultural appropriation if communities lack control over digital reproductions.

Cross-context problems hinder effective artifact stewardship on a broader scale. Interoperability issues stem from inconsistent metadata standards across institutions, making it difficult to share or migrate artifacts between systems without loss of contextual information. Resource intensity compounds this, as storing petabyte-scale archives incurs escalating costs for hardware, energy, and maintenance, often outpacing institutional budgets and risking selective preservation. Real-world incidents underscore these vulnerabilities.
The 2017 WannaCry ransomware attack disrupted global systems, including healthcare networks, revealing forensic recovery challenges such as encrypted artifact inaccessibility and the need for rapid, cross-jurisdictional analysis to reconstruct compromised evidence. Likewise, Yahoo's 2013 data breach, affecting all 3 billion accounts, highlighted preservation risks by exposing unencrypted user data to long-term theft and degradation, eroding trust in digital repositories.

Advancements in artificial intelligence (AI) and machine learning are transforming the detection and mitigation of digital artifacts in media. Since 2017, generative adversarial networks (GANs) have enabled automated techniques to repair artifacts such as compression distortions in images, with seminal work demonstrating semantic inpainting using deep generative models. Recent approaches further automate the identification of visual artifacts in compressed images, including texture degradation and color shifts, achieving high accuracy in distinguishing AI-compressed from traditional formats. In forensics, AI-assisted tools now enhance timeline reconstruction by analyzing artifacts like log files and network traces, using generative AI to attribute anomalies and build coherent event sequences from fragmented data.

Blockchain technologies are increasingly integrated for the immutable preservation of digital artifacts, ensuring provenance and integrity over time. The Ordinals protocol, introduced on Bitcoin in 2023, allows inscriptions of digital artifacts directly onto satoshis, creating unique, non-fungible representations akin to NFTs while leveraging Bitcoin's security for long-term storage. Non-fungible tokens (NFTs) serve as verifiable proofs of ownership for digital artifacts, providing blockchain-based certificates that extend to both purely digital and hybrid assets, with applications in digital art and authentication.

Emerging formats are expanding the scope of digital artifacts beyond traditional 2D media into immersive and sustainable domains.
Post-2020 advancements in photogrammetry and 3D scanning have facilitated high-fidelity digitization of heritage sites, generating virtual reality (VR) models that preserve artifacts as interactive digital twins for research and public access. Research on DNA-based data storage, prominent in the 2020s, offers ultra-dense, long-term archival solutions for digital artifacts, with densities exceeding petabytes per gram and stability spanning thousands of years, as demonstrated in early encoding experiments.

Interdisciplinary applications are bridging digital artifacts with fields like the humanities and computing security. In the digital humanities, scholars analyze artifacts within web archives to trace cultural narratives, employing web-archiving techniques to capture ephemeral online content for historical study. Quantum computing poses emerging threats to the cryptography securing digital artifacts, potentially decrypting protected cultural data; however, post-quantum cryptography standards are being developed to safeguard preservation efforts. Projections indicate significant growth in hybrid physical-digital artifacts by 2030, driven by augmented reality (AR) integrations that overlay digital enhancements on tangible objects, such as museum exhibits, with the global AR market expected to reach $599.59 billion, enabling widespread adoption in heritage and education.

References

  1. [1]
    Artifact Definition - TechTerms.com
    Dec 17, 2022 · An artifact, or compression artifact, is a small distortion in a digital image, video, or audio file caused by a lossy compression algorithm.
  2. [2]
    Definition of artifact - PCMag
    Artifacts are a natural byproduct of digital compression methods such as JPEG and MPEG, which permanently discard pixels. The greater the compression used, the ...
  3. [3]
    Coding Artifact - an overview | ScienceDirect Topics
    Coding artifacts refer to visible distortions that occur in digital images or videos as a result of lossy compression techniques, particularly when encoding ...Wavelet Denoising For Image... · Embedded Video Codecs · 11.3. 2 Decoder Structures...
  4. [4]
    Digital Artifact - an overview | ScienceDirect Topics
    Digital artifacts are collections of related data generated through the regular use of digital devices or systems, whether intentionally or unintentionally. 1 2
  5. [5]
    Digital Artifact - Artifact Details | MITRE D3FEND™
    name: Digital Artifact; definition: An information-bearing artifact (object) that is, or is encoded to be used with, a digital computer system.
  6. [6]
    Compression Artifacts - Cloudinary
    Sep 6, 2025 · Compression artifacts are distortions or imperfections in digital media, such as images, videos, and audio, that occur after the data is compressed to reduce ...
  7. [7]
    Artifact (error) - Wikipedia
    , in computer graphics, distortion of media by the data compression. Digital artifact, any undesired alteration in data introduced during its digital processing ...<|control11|><|separator|>
  8. [8]
    All Digital Objects are Born Digital Objects | The Signal
    May 15, 2012 · A digitized object exists to record and present characteristics of some physical object. In contrast, born digital objects began their existence as digital.
  9. [9]
    Demystifying Born Digital - OCLC
    "Born digital" refers to current and future information existing only in digital form, including data sets, websites, digital manuscripts, and photographs.
  10. [10]
    CuFA: A more formal definition for digital forensic artifacts
    Aug 7, 2016 · The term “artifact” currently does not have a formal definition within the domain of cyber/digital forensics, resulting in a lack of ...
  11. [11]
    Glossary - Digital Preservation Handbook
    Digital Preservation Refers to the series of managed activities necessary to ensure continued access to digital materials for as long as necessary. Digital ...
  12. [12]
    For an Archeology of the Digital Iconography - MDPI
    Nov 27, 2017 · ... digital artifact—to many conferences concerning this research (Figure 9). ... After this analysis on the origins of digital image processing ...
  13. [13]
    [PDF] The JPEG Still Picture Compression Standard
    This article gives an overview of JPEG's proposed image-compression standard. Readers without prior knowledge of JPEG or compression based on the Discrete ...
  14. [14]
    JPEG JFIF - W3C
    Feb 13, 1996 · The JPEG compression format was standardised by ISO in August 1990 and commercial applications using it began to show up in 1991. The widely ...
  15. [15]
    30 years of JSTOR: How a library shelf crisis sparked a global archive
    Apr 3, 2025 · In 1995, JSTOR launched with a mission that felt radical at the time: digitize scholarly journals and make them accessible online to researchers and educators ...
  16. [16]
    [PDF] An Historical Perspective of Digital Evidence: A Forensic Scientist's ...
    As early as 1984, the FBI Laboratory and other law enforcement agencies began developing programs to examine computer evidence. To properly address the growing ...
  17. [17]
    About IA - Internet Archive
    Dec 31, 2014 · We began in 1996 by archiving the Internet itself, a medium that was just beginning to grow in use. Like newspapers, the content published on ...
  18. [18]
    ISO 14721:2003 - Open archival information system
    This reference model addresses a full range of archival information preservation functions including ingest, archival storage, data management, access, and ...
  19. [19]
    [PDF] 20 Years of Progress in Video Compression – from MPEG-1 to ...
    This paper is an attempt to show this history of development. It highlights the history of individual algorithms of data encoding as well as the evolution of ...
  20. [20]
    Cultural heritage preservation by using blockchain technologies
    Jan 10, 2022 · In principle, a heritage artifact will be incorporated in the blockchain only once, while in ordinary ledgers every artifact (like Bitcoin) will ...
  21. [21]
    Unmasking digital deceptions: An integrative review of deepfake ...
    Sep 18, 2025 · Deepfakes, which are driven by developments in generative AI, seriously jeopardize public trust, cybersecurity, and the veracity of information.
  22. [22]
    Standards and best practice - Digital Preservation Handbook
    This can include identifying and quantifying the materials to be transferred, assessing the costs of preserving them and identifying the requirements for future ...
  23. [23]
    Artifacts in digital images - NASA Technical Reports Server (NTRS)
    Artifacts in digital images include aliasing from undersampling, interference from improper display, and harmonic overtones from quantization. Undersampling ...
  24. [24]
    [PDF] JPEG Artifacts Removal via Compression Quality Ranker-Guided ...
    Since JPEG compression applies DCT on each block, the correlation between adjacent blocks is ignored, which introduces blocking artifacts. Meanwhile ...
  25. [25]
    [PDF] NOISE MODELS IN DIGITAL IMAGE PROCESSING - arXiv
    Gaussian noise caused by natural sources such as thermal vibration of atoms and discrete nature of radiation of warm objects [5]. Gaussian noise generally ...
  26. [26]
    [PDF] A visual model for predicting chromatic banding artifacts
    Quantization of images containing low texture regions, such as sky, water or skin, can produce banding artifacts. As the bit-depth of each color channel is ...
  27. [27]
    Digital Radiography Image Artifacts | Radiology | SUNY Upstate
    Artifacts due to "aliasing" arise as a result of insufficient sampling of high frequency digital signals in an image represented by sharp edges or periodic ...
  28. [28]
    Gibbs and truncation artifacts | Radiology Reference Article
    Gibbs artifact, also known as truncation artifact or ringing artifact, is a type of MRI artifact. It refers to a series of lines in the MR image parallel to ...
  29. [29]
    Understanding Video Compression Artifacts - Component
    Feb 16, 2017 · Artifacts are first categorized by whether they're time/sequence-based (temporal) or location-based (spatial).
  31. [31]
    Nyquist frequency, Aliasing, and Color Moire - Imatest
    Any information above the Nyquist frequency that reaches the sensor will be “aliased” to a lower spatial frequency, which can result in the artifacts described ...
  32. [32]
    Nyquist Sampling Theorem - GeeksforGeeks
    Jul 23, 2025 · Aliasing occurs when the high frequency parts of the signal occurs in the lower frequency, causing distortion. Reconstruction of Signals: The ...
  33. [33]
    Aliasing Filter - an overview | ScienceDirect Topics
    In digital signal processing, aliasing occurs when measurable frequency content exists above one-half the sampling rate, causing energy or power in a ...
  34. [34]
    What is Aliasing and How Does it Impact Digital Signal Processing?
    Jun 27, 2025 · Aliasing can be caused by several factors, primarily related to insufficient sampling rates or poor system design. 1. Inadequate Sampling Rate: ...
  35. [35]
    Sampling rate and aliasing effect: signal processing explained - Kistler
    The aliasing effect is a measurement error in the signal occurring due to an incorrectly set sampling rate. If the sampling rate is too low, the Nyquist-Shannon ...
  36. [36]
    JPEG Image Compression - Interactive Tutorial
    Feb 12, 2016 · At the highest compression ratios, 8 x 8 blocking artifacts occur, which mask many of the image features. The consequences of high compression ...
  37. [37]
    JPEG block artifacts. The red dotted lines highlight the boundaries of...
    quantization of DCT coefficients (lossy compression) leaves traces at the boundaries of each 8 × 8 block, as shown in Figure 1. These traces, characteristic ...
  38. [38]
    Dealing With Noise In Image Sensors - Semiconductor Engineering
    Feb 1, 2024 · Noise typically results in grainy images, often associated with poor lighting, the speed at which an image is captured, or a faulty sensor.
  39. [39]
    Camera Noise - an overview | ScienceDirect Topics
    Temporal noise is a common artifact occurring in digital video sequences. Noise in digital imaging is mostly due to the camera and is particularly noticeable ...
  40. [40]
    Impact of Packet Loss Rate on Quality of Compressed High ... - NIH
    Mar 2, 2023 · This paper analyzes the adverse impact of packet loss on video quality encoded with various combinations of compression parameters and resolutions.
  41. [41]
    [PDF] Visibility of individual packet loss on H.264 encoded video stream
    This paper presents a study investigating the impact of individual packet loss on four types of H.264 main-profile encoded video streams. Four artifact factors ...
  43. [43]
    How to Address Quantization Errors in Analog-to-Digital Conversion?
    Jun 27, 2025 · Quantization error occurs during ADC when an analog input voltage is rounded to the nearest available digital value. This error is inevitable ...
  45. [45]
    What is Dithering? Using Dithering to Eliminate Quantization Distortion
    Dec 4, 2022 · Learn how dithering can be added to a signal to improve the performance of an analog-to-digital conversion system by eliminating quantization error and ...
  46. [46]
    Dynamic Range, Dithering and Noise Shaping - AudioCheck.net
    Dithering is the process of intentionally adding noise to the signal prior to its quantization. Dithering turns quantization distortion all the way down.
  47. [47]
    See Better and Further with Super Res Zoom on the Pixel 3
    Oct 15, 2018 · The Super Res Zoom technology in Pixel 3 is different and better than any previous digital zoom technique based on upscaling a crop of a single image.
  48. [48]
    Google Pixel 10 Pro's AI Zoom Sparks Debate on Photo Authenticity
    Sep 4, 2025 · Industry experts have dissected sample images from the device, revealing how AI interpolation can create artifacts or hallucinations—imagined ...
  49. [49]
    Reverse Z in 3D graphics (and why it's so awesome) - Hacker News
    Jun 4, 2024 · One thing that can go wrong in 3d graphics is z-fighting, where a scene has rendering artifacts because two objects are touching or intersect each other.
  50. [50]
    Making Sense of PSNR, SSIM, VMAF - Visionular
    Jul 8, 2024 · In this article, we talk about PSNR, SSIM, and VMAF – how they work, their drawbacks, and where they are used in the video compression and streaming industry.
  51. [51]
    Perceptual visual quality metrics: A survey - ScienceDirect.com
    In this paper, we give a systematic, comprehensive and up-to-date review of perceptual visual quality metrics (PVQMs) to predict picture quality according to ...
  52. [52]
    Preserving Authentic Digital Information • CLIR
    The relationship between digital preservation and authenticity stems from the fact that meaningful preservation implies the usability of that which is preserved ...
  53. [53]
    [PDF] PREMIS Final Report - Library of Congress
    Significant properties may be objective technical characteristics subjectively considered important, or subjectively determined characteristics. For example ...
  54. [54]
    Thirteen Ways of Looking at...Digital Preservation - D-Lib Magazine
    Implementation of preservation measures should be as transparent as possible to users of digital materials, and should not represent obstacles to access and use ...
  55. [55]
    artifactual value - SAA Dictionary
    artifactual value. n. the usefulness or significance of an object based on its physical or aesthetic characteristics, rather than its intellectual content
  56. [56]
    Unveiling the Wayback Machine's Vital Role in Investigative Work
    Jul 10, 2023 · The Wayback Machine has been particularly useful in finding and retrieving lost websites, said Ranca. She also makes sure materials she produces are preserved ...
  57. [57]
    Preservation action - Digital Preservation Handbook
    Emulation offers an alternative solution to migration that allows archives to preserve and deliver access to users directly from original files. This technique ...
  58. [58]
    Free to Use and Reuse Sets - The Library of Congress
    The digital collections comprise millions of items including books, newspapers, manuscripts, prints and photos, maps, musical scores, films, sound recordings ...
  59. [59]
    Digital Preservation at the Library of Congress
    Digital preservation efforts are distributed throughout many units at the Library of Congress and includes programs related to digital content packaging and ...
  60. [60]
    Exif Exchangeable Image File Format, Version 2.2
    Apr 9, 2024 · The Exif metadata, primarily technical metadata associated with camera settings, is shared by the two file types and represents an extension ...
  61. [61]
    DCMI: Dublin Core™ Metadata Element Set, Version 1.1: Reference ...
    "The Dublin Core", also known as the Dublin Core Metadata Element Set, is a set of fifteen "core" elements (properties) for describing resources. This fifteen- ...
  62. [62]
    PDF/A Family, PDF for Long-term Preservation
    May 9, 2024 · PDF/A is a family of ISO standards for constrained PDF forms intended for long-term preservation of page-oriented documents.Identification and description · Local use · Sustainability factors · File type signifiers
  63. [63]
    Emulation as a Digital Preservation Strategy - D-Lib Magazine
    Migration is the process of transferring data from a platform that is in danger of becoming obsolete to a current platform. This process has both dangers and ...
  64. [64]
    [PDF] LOCKSS: A Distributed Digital Archiving System Table of Contents
    Oct 8, 2002 · The LOCKSS model, which is based on analysis of cultural continuity epitomized by. "Lots of Copies Keeps Stuff Safe," creates low-cost, ...
  65. [65]
    Open archival information system (OAIS) - ISO 14721:2012
    ISO 14721:2012 defines the reference model for an open archival information system (OAIS). An OAIS is an archive, consisting of an organization, which may be ...
  66. [66]
    PREMIS: Preservation Metadata Maintenance Activity (Library of ...
    The PREMIS Data Dictionary for Preservation Metadata is the international standard for metadata to support the preservation of digital objects and ensure ...
  67. [67]
    [PDF] Trustworthy Repositories Audit & Certification: Criteria and Checklist - CRL / RLG Programs
    This new document, version 1.0 of the Criteria for Measuring Trustworthiness of Digital Repositories &. Archives: an Audit & Certification Checklist, represents ...
  68. [68]
    ARROW: Accessible Registries of Rights Information and Orphan ...
    The Accessible Registries of Rights Information and Orphan Works Towards Europeana (ARROW) project began in September 2008 with the aim of facilitating the ...
  69. [69]
    Best Practices for the Selection of Electronic File Formats
    Some formats, such as Microsoft Word documents (.doc), are proprietary; but due to their widespread popularity, they are relatively safe for preservation.
  70. [70]
    Forensic Artifact - an overview | ScienceDirect Topics
    Digital forensic artifacts can be defined as trace information left on a device, whether intentionally or unintentionally, through the regular use of that ...
  71. [71]
    Digital Forensics: Content vs. Artifacts - What's the Difference?
    Feb 17, 2017 · Digital forensic artifacts are better indicators of what actually transpired and reveal more things than content ever will.<|control11|><|separator|>
  72. [72]
    Computer Artifacts: Top artifacts investigators need - Magnet Forensics
    Jun 26, 2024 · Artifacts of execution, attribution and deletion are key parts of digital forensic examination to trace and interpret user activities on digital ...
  73. [73]
    What is an Artifact in Digital Forensics? - Cyber Centaurs
    In the realm of digital forensics, an artifact is any piece of information stored on a digital device that provides insights into the usage and activities ...
  74. [74]
    [PDF] Guide to Integrating Forensic Techniques into Incident Response
    The guidelines and procedures should support the admissibility of evidence into legal proceedings, including information on gathering and handling evidence ...
  75. [75]
    Developing an industry, creating the experts - OpenText Blogs
    Apr 11, 2019 · When EnCase Forensic launched in 1998, there was only one other solution attempting to help make sense of digital evidence. However, it only ran ...
  76. [76]
    [PDF] Windows Registry Forensic Tool Test Assertions and
    Jun 1, 2018 · WRT-AO-05 If a Windows registry forensic tool provides the user with the ability to extract registry forensic artifacts well-known in the field ...
  77. [77]
    The Truth About USB Device Serial Numbers | SANS Institute
    Nov 1, 2023 · As an aside, it might come as a surprise to many forensicators that the USBSTOR key does NOT contain all USB devices that have been attached.
  78. [78]
    Forensic Value of Prefetch - SANS Internet Storm Center
    Oct 20, 2022 · Found in C:\Windows\Prefetch by default, prefetch files (.pf) contain a wealth of information that can prove vital to any investigation.
  79. [79]
    [PDF] Digital Forensics and Incident Response (DFIR) Framework for ...
    Network Full content traffic – A complete packet capture of network traffic, whether it is Serial-based or Ethernet-based data. Usually, this type of data ...
  80. [80]
    Leveraging the Windows Amcache.hve File in Forensic Investigations
    The Amcache.hve is a registry hive file that is created by Microsoft® Windows® to store the information related to execution of programs.
  81. [81]
    [PDF] Advancing coordinated cyber-investigations and tool interoperability ...
    This paper features a proof-of-concept implementation using the open source forensic framework named plaso to export data to CASE. Community members are ...
  82. [82]
    [PDF] An Empirical Comparison of Widely Adopted Hash Functions in ...
    May 19, 2025 · For instance, to ensure the integrity of digital evidence in court, a forensic examiner traditionally computes the hash digest of the entire ...
  83. [83]
    File Carving - Computer Forensics Tools & Techniques Catalog
    File carving is searching for and reconstructing files based on content, rather than file system metadata.
  84. [84]
    Snapshot: S&T is Enhancing the Autopsy Digital Forensics Tool
    Dec 12, 2017 · Autopsy—an open-source, digital forensics platform used by law enforcement agencies worldwide to determine how a digital device was used in a ...
  85. [85]
    [PDF] The Evolution of Volatile Memory Forensics - OSTI.gov
    Sep 2, 2022 · Several tools, including Volatility and Rekall, have been created to parse the memory dumps. Some older, more established methods for memory ...
  86. [86]
    [PDF] Anti-Forensics: Techniques, Detection and Countermeasures
    Anti-Forensics (AF) tools and techniques frustrate CFTs by erasing or altering information; creating “chaff” that wastes time and hides information; ...
  87. [87]
    [PDF] Bit Rot and Silent Data Corruption in Digital Audiovisual Preservation
    Bit Rot and Silent Data Corruption in Digital Audiovisual Preservation. Jeffrey Lauber. Introduction. As atoms decay, so too do bits. Though relatively mild ...
  88. [88]
    Fundamentals of AV Preservation - Chapter 4 - NEDCC
    Obsolescence can happen at the file or program level (such as Real Media and WordStar) or with the media on which the file is stored (such as Zip drives).
  89. [89]
    Preservation issues - Digital Preservation Handbook
    But some repositories still face significant challenges in developing and maintaining scalable architectures and procedures to handle growing quantities of data ...
  90. [90]
    [PDF] Current Privacy Concerns with Digital Forensics - Faculty
    Sanctions for noncompliance are an important part of the GDPR. Article 83 says they can be (1) warnings in cases of first and unintentional non-compliance, (2) ...
  91. [91]
    Safeguarding Cultural Heritage in the Digital Era – A Critical ...
    Aug 21, 2023 · This paper explores the disruptive impact of digitization on cultural heritage preservation, focusing on the challenges posed by intellectual property rights.
  92. [92]
    Achieving Interoperability at the Record and Repository Levels
    The major challenge in converting records prepared according to a particular metadata scheme into records based on another schema (see Figure 1) is how to ...
  93. [93]
    [PDF] The Significance of Storage in the “Cost of Risk” of Digital Preservation
    Abstract. As storage costs drop, storage is becoming the lowest cost in a digital repository – and the biggest risk. We examine current modelling of costs ...
  94. [94]
    [PDF] Lessons learned review of the WannaCry Ransomware Cyber Attack
    Feb 1, 2018 · On Friday 12 May 2017, a global ransomware attack, known as WannaCry, affected a wide range of countries and sectors. Although WannaCry impacted ...
  95. [95]
    [PDF] THE YAHOO DATA BREACH - American University Law Review
    Oct 5, 2017 · Social media and electronic commerce websites face significant risk factors, and an acquirer may inherit cyber liability and vulnerabilities.
  96. [96]
    [PDF] Semantic Image Inpainting With Deep Generative Models
    The current GAN model in this paper works well for relatively simple structures like faces, but is too small to represent complex scenes in the world. Conve ...
  97. [97]
    JPEG AI Image Compression Visual Artifacts: Detection Methods ...
    Nov 11, 2024 · In this work, we propose methods to separately detect three types of artifacts (texture and boundary degradation, color change, and text corruption)
  98. [98]
    (PDF) GenAI in Digital Forensics: Enhancing Timeline ...
    Mar 25, 2025 · This paper explores the integration of Generative Artificial Intelligence (GenAI) into digital forensic workflows, focusing on the enhancement of timeline ...
  99. [99]
    Q&A: Bitcoin Ordinals, Inscriptions, and Digital Artifacts
    Apr 20, 2023 · Since early 2023, the Bitcoin community has been abuzz with the introduction of “digital artifacts,” or essentially non-fungible tokens (NFTs) ...
  100. [100]
    Beyond the bubble: Will NFTs and digital proof of ownership ...
    Non-fungible Tokens (NFTs) are blockchain-enabled cryptographic assets that represent proof-of-ownership for digital objects. The use of NFTs has been ...
  101. [101]
    Three-Dimensional Printing and 3D Scanning - MDPI
    Three-dimensional scanning is used to create detailed digital models of cultural heritage sites, artifacts, and monuments, which can be used for research, ...
  102. [102]
    DNA storage: research landscape and future prospects - PMC - NIH
    The figure shows seminal publications in the history of research on DNA ... data storage densities achieved in major DNA data storage publications since 2012.
  103. [103]
    Web-archiving and social media: an exploratory analysis
    Jun 22, 2021 · The aim of this article is to study the landscape of born-digital collections in heritage institutions and academia, in particular social media archiving (SMA) ...
  104. [104]
    Heritage preservation in quantum computing era - Nature
    May 22, 2025 · Once they are in digital form, cryptography is a key means of preserving them. However, due to advances in quantum computing, many of the ...
  105. [105]
    Augmented Reality Market Size, Share & Trends Report 2030
    The global augmented reality market size was estimated at USD 83.65 billion in 2024 and is projected to reach USD 599.59 billion by 2030, growing at a CAGR of ...