
Buffer underrun

A buffer underrun, also known as a buffer underflow, is a condition in computing where data is read from a buffer at a faster rate than it is supplied, resulting in the buffer becoming empty and halting the associated process. This error commonly arises in real-time data transfer scenarios, such as audio recording, media playback, and network communications, where consistent data flow is essential to avoid interruptions.

In the context of CD and DVD burning, a buffer underrun happens when the computer's data supply to the writer cannot keep pace with the high-speed recording process, causing the drive's buffer to deplete and typically ruining the disc by creating an incomplete or erroneous burn. To mitigate this issue, drive manufacturers developed buffer underrun protection technologies in the late 1990s, including Sanyo's BURN-Proof and Yamaha's SafeBurn, which detect impending underruns and temporarily pause recording to allow buffer refilling without media damage.

Beyond storage media, buffer underruns affect audio/video decoding and streaming, where they manifest as audio dropouts, video freezes, or playback stuttering due to depleted buffers in decoders or players. In networking and embedded systems, such as serial communications or DMA transfers, underruns can lead to data loss or system stalls if input rates lag behind output demands. From a software security perspective, buffer underflow, a related vulnerability, involves writing data to memory locations preceding a buffer's start, potentially enabling attackers to execute arbitrary code or cause denial-of-service conditions. Prevention strategies across these domains include enlarging buffer sizes, optimizing system resources (for example, by defragmenting storage drives), and using error-handling mechanisms to ensure reliable data pipelines.

Fundamentals

Definition and Mechanism

A buffer underrun, also referred to as a buffer underflow, occurs when a program or device depletes the data in a temporary buffer faster than the producer or source can replenish it, leading to an empty buffer and subsequent interruptions, pauses, or errors in the data stream. This condition arises in producer-consumer systems where the buffer serves as an intermediary to balance differing rates of data generation and consumption.

The mechanism of a buffer underrun unfolds in a structured sequence within data pipelines. Initially, the producer continuously fills the buffer with data, maintaining a level that accommodates the consumer's needs. As long as the production rate matches or exceeds the consumption rate, the buffer remains adequately stocked, enabling smooth operation. However, when the consumer's read rate surpasses the producer's write rate, due to inherent speed disparities, the buffer's contents diminish progressively. Upon reaching emptiness, the system detects the underrun, typically through a status flag or interrupt signal, triggering an interruption in the consumer's activity, such as halting output or inserting gaps in the stream to prevent further errors. Conceptually, this process can be visualized as a linear pipeline: data moves from the producer to the buffer (a fixed-size queue or ring buffer), then to the consumer, with the underrun manifesting at the point where the buffer's read pointer overtakes the write pointer, signaling depletion. This condition is the counterpart to buffer overflow, in which the producer overwhelms the buffer's capacity.

The term "buffer underrun" was popularized in the 1990s alongside the rise of consumer CD burning technology, where such errors could render recording sessions unusable. The broader concept of buffers for managing asynchronous data transfer, however, originated in early computing systems of the 1940s.

Buffer overflow occurs when a program attempts to write more data to a buffer than its allocated capacity can accommodate, leading to data overwriting adjacent memory locations, potential crashes, or vulnerabilities such as code-injection attacks. This contrasts with buffer underrun, where the issue stems from insufficient data supply for consumption rather than excess input overwhelming storage. In overflow scenarios, the rapid influx of data from sources like user inputs or network packets exceeds buffer limits, often in programming contexts where bounds checking is inadequate. Buffer overrun is frequently used interchangeably with buffer overflow and refers to the act of writing beyond the designated boundaries of a buffer, which can corrupt program state or enable exploits by altering control flow. Unlike underrun's focus on depleted buffers in output streams, overrun (or overflow) emphasizes boundary violations in memory, commonly exploited in stack-based or heap-based attacks to execute arbitrary code.
| Error Type | Cause | Effect | Common Domains |
| --- | --- | --- | --- |
| Buffer Underrun | The consumer reads data faster than the producer supplies it, emptying the buffer prematurely. | Pauses in output, playback gaps, or process failure (e.g., incomplete recordings). | Real-time media processing, such as audio/video playback and optical disc burning. |
| Buffer Overflow (also known as buffer overrun) | The producer writes more data than the buffer's capacity, exceeding storage limits, often due to unchecked pointers or indices. | Data corruption, program crashes, or security breaches via memory corruption and adjacent-memory overwrite. | Network protocols, programs written in C/C++, embedded systems, and cybersecurity. |
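The pointer-based mechanism can be made concrete with a short sketch. The following C example is an illustrative toy rather than code from any particular system: it implements a fixed-size ring buffer and runs a consumer that reads twice per step while the producer writes once, so the buffer quickly empties and each failed read corresponds to an underrun.

```c
/* Minimal sketch of the read/write-pointer mechanism: a fixed-size ring
 * buffer in which consumption outpacing production leads to underrun.
 * All sizes and the access pattern are illustrative. */
#include <stdio.h>
#include <stdbool.h>

#define CAPACITY 8

typedef struct {
    int data[CAPACITY];
    int read;   /* index of next element to consume */
    int write;  /* index of next free slot          */
    int count;  /* elements currently stored        */
} ring_buffer;

static bool rb_put(ring_buffer *rb, int value)      /* producer side */
{
    if (rb->count == CAPACITY) return false;         /* would overflow */
    rb->data[rb->write] = value;
    rb->write = (rb->write + 1) % CAPACITY;
    rb->count++;
    return true;
}

static bool rb_get(ring_buffer *rb, int *value)      /* consumer side */
{
    if (rb->count == 0) return false;                 /* underrun */
    *value = rb->data[rb->read];
    rb->read = (rb->read + 1) % CAPACITY;
    rb->count--;
    return true;
}

int main(void)
{
    ring_buffer rb = {0};
    int v;

    /* Producer writes 1 item per step, consumer reads 2: consumption
     * outpaces production, so the buffer soon runs dry. */
    for (int step = 0; step < 10; step++) {
        rb_put(&rb, step);
        for (int r = 0; r < 2; r++)
            if (!rb_get(&rb, &v))
                printf("step %d: buffer underrun (empty)\n", step);
    }
    return 0;
}
```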
The term "buffer underrun" emerged distinctly in the late 1990s alongside the rise of consumer /DVD recording technologies, where continuous data streams were critical for uninterrupted burning processes, differentiating it from the earlier "" concept rooted in programming errors documented since the 1970s and popularized through security incidents like the 1988 . This evolution reflects a shift toward output-bound systems in media handling versus the input-bound memory issues prevalent in general .

Causes

Producer-Consumer Imbalance

Producer-consumer imbalance in buffered systems occurs when the rate of data production fails to keep pace with the rate of consumption, resulting in the buffer depleting faster than it is replenished and leading to underrun. This mismatch disrupts continuous data flow, particularly in real-time applications where uninterrupted processing is essential.

Producer delays exacerbate this imbalance through factors that hinder timely data generation. Slow generation, such as encoding bottlenecks in video or audio processing, arises from high computational demands that reduce the production rate below expectations. Intermittent sources, exemplified by network streams, introduce fluctuations in arrival rates, creating temporary shortages that deplete the buffer during low-throughput periods.

Consumer overconsumption further contributes by demanding data at a rate exceeding supply, often due to fixed-rate requirements in output devices. In playback scenarios, such as audio or video rendering, the consumer maintains a constant high-speed output to ensure smooth delivery, which can outpace slower or variable input, causing the buffer to empty prematurely. Synchronization issues between producer and consumer, particularly the absence of effective handshaking mechanisms, lead to desynchronized rates and heighten underrun risks. Without proper signaling to coordinate readiness and data transfer, the consumer may attempt to read from an insufficiently filled buffer, amplifying rate discrepancies.

This dynamic can be represented mathematically through the evolution of the buffer level over time:

B(t) = B_0 + \int_0^t \left( P(\tau) - C(\tau) \right) \, d\tau

where B_0 denotes the initial buffer level, P(\tau) is the instantaneous production rate, and C(\tau) is the instantaneous consumption rate. Underrun manifests when B(t) reaches zero (that is, when the unconstrained integral would become negative), indicating the buffer has been exhausted.
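The integral above can be approximated in discrete time as B[k+1] = B[k] + (P[k] - C[k]) * dt. The C sketch below, using arbitrary illustrative rates and capacities, simulates a producer that periodically stalls and reports the moments at which the buffer would be exhausted.

```c
/* Discrete-time illustration of the buffer-level equation:
 * B[k+1] = B[k] + (P[k] - C[k]) * dt, clamped to the buffer capacity.
 * Rates, capacity, and the stall pattern are illustrative values only. */
#include <stdio.h>

int main(void)
{
    const double dt       = 0.1;    /* seconds per step            */
    const double capacity = 64.0;   /* kilobytes                   */
    const double consume  = 176.4;  /* KB/s, e.g. CD-quality audio */
    double level = 32.0;            /* B_0: initial fill, KB       */

    for (int k = 0; k < 40; k++) {
        /* Producer alternates between keeping up and stalling briefly. */
        double produce = (k % 10 < 7) ? 200.0 : 0.0;   /* KB/s */

        level += (produce - consume) * dt;
        if (level > capacity) level = capacity;
        if (level <= 0.0) {
            printf("t=%.1fs: buffer exhausted -> underrun\n", k * dt);
            level = 0.0;                /* playback would stall here */
        }
    }
    return 0;
}
```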

Resource Limitations

Hardware bottlenecks significantly contribute to buffer underrun by limiting the rate at which data can be supplied to the buffer, often in scenarios requiring continuous flow such as optical disc recording. Slow storage devices, particularly hard disk drives (HDDs) with high seek times and low sustained transfer rates, fail to deliver data quickly enough, causing the buffer to deplete during high-speed operations. For instance, in CD burning, HDD transfer rates below those required for the burn speed (typically around 1.2 MB/s for 8x writing) can interrupt the data stream, leading to underruns. Insufficient CPU cycles further exacerbate this by delaying data processing and preparation, as the system cannot allocate enough computational resources to keep pace with consumer demands in real-time applications. Peripheral speed mismatches, such as connecting a high-speed CD writer to a USB 1.1 bus limited to 12 Mbit/s, create additional constraints by throttling overall throughput.

Software inefficiencies in multitasking can starve producer tasks, indirectly causing buffer underrun through delayed data generation. Poorly implemented threading may lead to contention and delays, preventing timely buffer refills, while interrupt-handling latencies from competing system events can disrupt data flow in real-time or low-latency contexts. Operating system scheduling priorities that deprioritize producer processes, such as in cases of priority inversion, allow higher-priority tasks to monopolize resources, leaving the buffer insufficiently replenished during critical periods. Environmental factors like overheating trigger hardware-level throttling to prevent damage, reducing CPU or peripheral performance and contributing to buffer underrun. Power-saving modes, by dynamically lowering clock speeds or suspending components, similarly diminish available cycles for data handling, particularly in mobile or battery-constrained systems.

A notable case from the late 1990s involves CD burners, where widespread failures occurred due to HDD speeds inadequate for emerging high-speed writing requirements. Prior to the introduction of buffer underrun protection technologies like Sanyo's BURN-Proof in 1999, systems with sustained HDD transfer rates below the roughly 1.2 MB/s needed for 8x writing frequently experienced underruns during burns at 8x or higher, rendering discs unusable as the drive could not pause writing without error. This issue was prevalent in home and office setups with HDDs averaging 10-20 MB/s burst but lower sustained rates, prompting the rapid adoption of mitigation features by 2000 to accommodate typical hardware limitations.
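A rough calculation illustrates why sustained transfer rate, rather than burst rate, was the limiting factor: at 8x a CD writer consumes about 1.2 MB/s, so even a multi-megabyte onboard buffer survives only a few seconds once the host stops supplying data. The C sketch below uses typical, not manufacturer-specified, buffer sizes to show the required sustained rates and resulting buffer survival times.

```c
/* Back-of-the-envelope illustration of why slow hard drives caused CD
 * burn failures: how long a drive's onboard buffer survives once the
 * host stops supplying data.  1x CD speed is 150 KB/s; the buffer sizes
 * are typical examples, not the specs of any particular drive. */
#include <stdio.h>

int main(void)
{
    const double kb_per_sec_1x = 150.0;
    const int    speeds[]      = { 4, 8, 16, 52 };
    const double buffer_kb[]   = { 512.0, 2048.0, 8192.0 };

    for (int s = 0; s < 4; s++) {
        double rate = speeds[s] * kb_per_sec_1x;   /* required KB/s */
        printf("%2dx burn needs %6.0f KB/s sustained;", speeds[s], rate);
        for (int b = 0; b < 3; b++)
            printf("  %4.0f KB buffer lasts %4.1f s",
                   buffer_kb[b], buffer_kb[b] / rate);
        printf("\n");
    }
    return 0;
}
```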

Prevention and Mitigation

Buffer Sizing and Management

Buffer sizing and management involve determining the appropriate capacity and adjustment mechanisms for buffers to accommodate variations in data production and consumption rates, thereby preventing underruns where the consumer outpaces the producer. Static sizing employs fixed buffer capacities suitable for environments with predictable data loads, such as systems with constant data rates, ensuring consistent allocation without overhead from resizing operations. In contrast, dynamic sizing allows buffers to adapt by growing or shrinking based on monitoring of input and output rates, which is particularly effective in variable-rate scenarios like network streaming where traffic fluctuations can lead to temporary imbalances.

Guidelines for optimal buffer sizing often derive from the maximum expected shortfall between production and consumption rates over the system's worst-case delay interval. This approach ensures the buffer can absorb temporary shortfalls without depleting entirely. For real-time applications, such as audio or video playback, practical implementations typically allocate 1-2 seconds' worth of data (e.g., approximately 176 KB for 1 second of uncompressed 16-bit stereo audio at 44.1 kHz) to provide resilience against brief interruptions while minimizing playback delay.

Circular buffering implements these sizes efficiently for continuous data streams by treating the buffer as a ring structure, where write and read pointers wrap around upon reaching the end, enabling overwrite of the oldest data without reallocating or shifting memory. This reduces computational waste and latency in producer-consumer pipelines, as pointers advance modulo the buffer length, facilitating seamless handling of ongoing inputs like sensor data or media decoding and thereby lowering the risk of underruns in steady-state operations.

Effective management incorporates monitoring tools that track metrics such as fill percentage (the ratio of current occupancy to total capacity) to detect impending underruns and trigger proactive adjustments, like rate throttling or alerts. In software frameworks for streaming, these metrics enable algorithms to maintain fill levels above a threshold (e.g., 20-30%) by dynamically refilling during low-occupancy states, ensuring uninterrupted flow without overprovisioning resources.
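As a worked example of these sizing rules, the following C sketch (with illustrative parameters) computes the bytes needed to hold one or two seconds of uncompressed 16-bit stereo audio at 44.1 kHz and applies a simple fill-percentage threshold check of the kind described above.

```c
/* Sketch of the sizing guidance above: allocate enough buffer to cover
 * the worst expected supply gap at the consumer's fixed rate, shown here
 * for uncompressed 16-bit stereo PCM at 44.1 kHz (~176.4 KB per second).
 * Function names and the 25% threshold are illustrative choices. */
#include <stdio.h>
#include <stddef.h>

/* Bytes needed to buffer 'seconds' of audio at the given format. */
static size_t audio_buffer_bytes(double seconds, double sample_rate,
                                 int channels, int bytes_per_sample)
{
    return (size_t)(seconds * sample_rate * channels * bytes_per_sample);
}

/* Simple fill-percentage check used for proactive refilling. */
static int below_threshold(size_t occupied, size_t capacity, double threshold)
{
    return (double)occupied / (double)capacity < threshold;
}

int main(void)
{
    size_t one_sec = audio_buffer_bytes(1.0, 44100.0, 2, 2);
    size_t two_sec = audio_buffer_bytes(2.0, 44100.0, 2, 2);

    printf("1 s of 16-bit stereo @ 44.1 kHz: %zu bytes (~176 KB)\n", one_sec);
    printf("2 s: %zu bytes\n", two_sec);

    /* Refill when occupancy drops under 25% of capacity. */
    size_t occupied = one_sec / 5;
    printf("fill %.0f%% -> refill needed: %s\n",
           100.0 * occupied / one_sec,
           below_threshold(occupied, one_sec, 0.25) ? "yes" : "no");
    return 0;
}
```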

Real-Time Scheduling Techniques

Real-time scheduling techniques in operating systems prioritize the timely execution of tasks involved in data production and consumption to mitigate buffer underruns, particularly in scenarios where producer-consumer imbalances can lead to depleted buffers. These methods leverage kernel-level mechanisms to allocate CPU resources predictably, ensuring that producer threads, which generate data, execute with minimal delay relative to consumer threads that deplete the buffer. By enforcing deadlines and minimizing preemption from non-critical tasks, such techniques maintain buffer integrity in time-sensitive applications like audio processing.

Priority scheduling is a core approach, where higher operating system priorities are assigned to producer threads to guarantee their execution before lower-priority consumers, thus preventing underruns caused by delayed data filling. In Linux kernels using the PREEMPT_RT patchset (mainlined as of 2024), this priority-based preemptive scheduling allocates explicit CPU time to real-time tasks, enabling them to meet strict deadlines and eliminate buffer underruns in audio and graphics subsystems. For instance, the kernel's real-time group scheduling extends this by bounding CPU usage for task groups, further stabilizing performance in multimedia environments.

Interrupt coalescing, also known as interrupt moderation, optimizes hardware handling by grouping multiple events into fewer interrupts, striking a balance between reduced CPU overhead (for throughput) and controlled latency to avoid processing stalls that exacerbate underruns. In audio drivers, this technique prevents excessive interrupt frequency from overwhelming the system while ensuring timely buffer refills, as overly aggressive coalescing could delay responses and risk underruns, whereas minimal coalescing maintains low latency at the cost of higher CPU utilization. Windows driver documentation highlights its role in network drivers, where it similarly mitigates interrupt-induced issues, a principle applicable to audio operations.

Feedback loops employ dynamic algorithms to monitor buffer levels and adjust production rates in real time, compensating for clock drifts or varying workloads to sustain equilibrium between producers and consumers. In audio drivers, particularly for USB devices in asynchronous mode, explicit feedback endpoints send signals from the sink to the host, requesting adjustments in sample transmission rates (e.g., from 48 to 47 or 49 samples per frame) based on buffer occupancy, thereby averting underruns or overruns without hardware PLL synchronization. Implementations often incorporate proportional-integral-derivative (PID) control, where proportional terms respond to current buffer deviations, integral terms eliminate steady-state errors from clock mismatches, and derivative terms anticipate changes, as seen in embedded audio systems for precise rate adaptation.

A practical implementation is the Windows Audio Session API (WASAPI), which utilizes event-driven scheduling in exclusive mode to notify applications precisely when endpoint buffers require data, enabling just-in-time refilling that minimizes latency and prevents underruns during sound playback. Through the IAudioClient interface, developers specify an event handle during stream initialization, allowing the audio engine to signal buffer thresholds; this ensures consistent data flow in low-latency scenarios, such as professional audio software, by aligning application callbacks with hardware demands.
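The feedback-loop idea can be illustrated with a simple proportional controller. The sketch below is a hypothetical, simplified model of the 47/48/49-samples-per-frame adjustment described for asynchronous USB audio; the gain, buffer capacity, and target fill level are made-up values and the code is not taken from any driver.

```c
/* Minimal sketch of a proportional feedback loop for an asynchronous
 * USB-audio-style sink: the device requests slightly more or fewer
 * samples per 1 ms frame depending on how far the buffer fill level
 * deviates from its target.  All names and constants are illustrative. */
#include <stdio.h>

#define NOMINAL_SAMPLES_PER_FRAME 48   /* 48 kHz, 1 ms USB frame    */
#define BUFFER_CAPACITY           4096 /* samples                   */
#define TARGET_FILL               (BUFFER_CAPACITY / 2)
#define GAIN                      0.01 /* proportional gain (tuned) */

/* Compute the sample count to request for the next frame. */
static int requested_samples(int current_fill)
{
    double error  = (double)(TARGET_FILL - current_fill); /* + means starving */
    double adjust = GAIN * error;

    /* Clamp to +/-1 sample per frame, as in 47/48/49 feedback schemes. */
    if (adjust > 1.0)  adjust = 1.0;
    if (adjust < -1.0) adjust = -1.0;

    return NOMINAL_SAMPLES_PER_FRAME +
           (int)(adjust > 0 ? adjust + 0.5 : adjust - 0.5);
}

int main(void)
{
    int fills[] = { 2048, 1500, 800, 200, 3000, 3900 };
    for (int i = 0; i < 6; i++)
        printf("fill=%4d -> request %d samples\n",
               fills[i], requested_samples(fills[i]));
    return 0;
}
```

A fuller implementation would add integral and derivative terms, as the PID description above suggests, so that persistent clock drift between source and sink is corrected rather than merely bounded.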

Applications in Media

Optical Disc Recording

In optical disc recording, the burn process relies on a continuous data stream to the drive's internal buffer, which feeds the laser mechanism etching pits and lands onto the disc surface to represent the data. If the buffer underruns (depleting faster than it refills), the laser must halt abruptly to avoid writing errors, resulting in an incomplete and unusable disc commonly called a "coaster." This failure mode is particularly acute in write-once formats like CD-R and DVD-R, where the organic dye layer cannot be rewritten without physical damage.

Early CD burners, prevalent before 2000, typically featured small onboard buffers of 128 KB to 2 MB, which proved insufficient at write speeds beyond 4x (approximately 600 KB/s), as host systems struggled to sustain data transfer rates amid interruptions like hard drive seeks or CPU load. These limitations often led to frequent underruns during burns at 8x or higher, prompting users to restrict speeds to 1x-4x for reliability. The adoption of larger buffers of 8 MB or more in later drives provided a deeper reservoir, allowing brief pauses in data inflow without immediate underrun, though it did not fully eliminate the risk in resource-constrained environments.

To combat persistent underrun issues, drive manufacturers developed underrun protection technologies that enable the laser to temporarily suspend writing at predefined points (such as between data packets) without compromising integrity, then resume once the buffer refills. Plextor's implementation of Sanyo's BURN-Proof, debuting in 2000 with the PlexWriter 12/10/32A ATAPI drive, pioneered this approach by detecting impending underruns and inserting seamless "power gaps" in the recording stream, adhering to Red Book audio and Orange Book standards. Subsequent implementations, such as Ricoh's JustLink and Yamaha's SafeBurn, followed similar principles, becoming standard in drives by the early 2000s and drastically reducing coaster rates even at 16x-52x speeds.

In modern contexts, buffer underruns during optical disc recording are uncommon, thanks to SSD-based storage offering sustained read speeds exceeding 500 MB/s and system buffers augmented by gigabytes of RAM, which far outpace the 11-54 MB/s demands of CD/DVD/Blu-ray writes. However, they persist in niche archival applications, such as burning large datasets from legacy hard drives or under heavy multitasking, where protection technologies remain essential safeguards.

Audio and Video Playback

In audio playback, a buffer underrun manifests as the digital-to-analog converter (DAC) running out of data, resulting in audible artifacts like pops and clicks that interrupt the continuous stream required for smooth reproduction. This issue arises because audio systems must deliver samples at precise intervals, such as the 44.1 kHz rate standard for CD-quality audio, where any delay in buffer replenishment causes the DAC to output silence or erroneous values until new data arrives. For video playback, a buffer underrun leads to frame drops or temporary freezes in the renderer as it exhausts its queue of decoded frames before new ones are processed, compromising visual continuity and often desynchronizing audio and video elements, which disrupts lip-sync in audiovisual streams. These effects are particularly noticeable in scenarios demanding real-time decoding, where the playback device cannot sustain the data flow needed to match the presentation rate.

Buffer underruns frequently occur in low-latency live audio environments, such as DJ software, where minimal processing delay is essential for mixing, but interruptions from CPU overload or inefficient drivers cause glitches like crackles. Similarly, attempting video playback on underpowered devices exacerbates the problem, as limited computational resources fail to decode and buffer frames quickly enough, leading to stutters during high-demand sequences. To address these challenges in audio, Steinberg introduced ASIO drivers in 1997, enabling direct hardware access to bypass operating system overhead and achieve latencies as low as a few milliseconds, thereby minimizing underruns in professional setups.
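The relationship between buffer size, latency, and the refill deadline can be shown with a short calculation. The following C sketch, independent of any specific audio API, converts some common buffer sizes at 44.1 kHz into the time the producer has before the DAC runs dry.

```c
/* Illustrative calculation (not tied to any particular audio API):
 * how an audio buffer size translates into the deadline the producer
 * must meet before the DAC runs out of samples. */
#include <stdio.h>

int main(void)
{
    const double sample_rate = 44100.0;          /* CD-quality rate, Hz */
    const int    frames[]    = { 64, 256, 1024, 4096 };

    for (int i = 0; i < 4; i++) {
        double deadline_ms = 1000.0 * frames[i] / sample_rate;
        printf("%5d-frame buffer -> producer must refill within %.2f ms\n",
               frames[i], deadline_ms);
    }
    /* Smaller buffers give lower latency but tighter deadlines;
     * missing one deadline empties the buffer and causes a click or pop. */
    return 0;
}
```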

Modern Contexts

Streaming and Networking

In streaming and networking contexts, a buffer underrun occurs when network variability, such as jitter and packet loss, depletes the playback buffer faster than data can be received, leading to interruptions in applications like VoIP and video calls. Jitter, defined as the variation in packet arrival times, disrupts the sequential delivery of audio or video packets, causing the buffer to empty if not adequately compensated by jitter buffers that temporarily store and reorder packets. For instance, in VoIP systems, jitter exceeding typical acceptable levels (often around 30 ms) can result in audible distortions, as digital signal processors (DSPs) in routers have limited capacity to smooth out delays. Similarly, during video conferencing, network jitter, common on platforms like Zoom, amplifies these effects, where even low packet loss rates can contribute to buffer depletion and playback stalls, particularly on unstable connections.

To mitigate underruns in bandwidth-fluctuating environments, adaptive bitrate protocols like HTTP Live Streaming (HLS) and Dynamic Adaptive Streaming over HTTP (DASH) dynamically adjust video quality by selecting lower-bitrate segments when throughput drops, thereby maintaining buffer levels and preventing depletion. These protocols segment content into short chunks (typically 2-10 seconds) and use client-side heuristics to estimate available bandwidth, switching resolutions to avoid rebuffering events that occur when the buffer falls below a critical threshold. Common buffer targets in video-on-demand services range from 20-30 seconds to provide a safety margin against transient network dips, allowing seamless playback without excessive startup delay. In practice, algorithms like those in MPEG-DASH detect congestion early and downshift bitrates (e.g., from 2 Mbps to 950 Kbps) to refill the buffer proactively.

Recent advancements in 5G and edge computing have reduced overall latency in streaming to under 20 ms for many applications, enabling smoother data transmission through localized processing at network edges, but underrun risks persist in high-demand scenarios like AR/VR streaming due to residual variability in mobile networks. Post-2020 developments, including Mobile Edge Computing (MEC) integration with 5G, offload rendering and caching closer to users, supporting immersive content at bitrates over 50 Mbps; however, sudden handovers or interference can still cause packet delays, depleting buffers optimized for low latency (e.g., 5-10 seconds) rather than deep queuing. Surveys highlight that while 5G boosts throughput by 10-20x over 4G, AR/VR streams remain vulnerable to underruns without adaptive caching strategies that predict and preload tiles based on user movements.

A prominent example is Netflix's implementation of machine learning-driven predictive buffering in its adaptive streaming pipeline, which forecasts throughput fluctuations up to 30 seconds ahead to preemptively adjust segment downloads and avoid underruns. By analyzing historical network patterns and real-time metrics, the system significantly reduces rebuffering frequency compared to traditional throughput-based heuristics, ensuring consistent playback quality during variable conditions like peak-hour congestion. This approach, detailed in Netflix's published analyses, integrates with DASH-like protocols to maintain buffer health without over-provisioning data.
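A minimal sketch of the buffer-aware bitrate decision used by such clients is shown below; the bitrate ladder, thresholds, and safety margin are illustrative assumptions rather than values from any actual HLS or DASH player.

```c
/* Minimal sketch of a buffer-aware adaptive-bitrate decision, loosely
 * modeled on DASH/HLS client heuristics.  The thresholds and bitrate
 * ladder are illustrative, not taken from any specific player. */
#include <stdio.h>

static const int ladder_kbps[] = { 400, 950, 2000, 4500 }; /* renditions */
#define LADDER_LEN 4

/* Pick a rendition for the next segment from the current buffer level
 * (seconds of media queued) and the measured throughput estimate. */
static int next_bitrate_kbps(double buffer_s, double throughput_kbps)
{
    /* Panic zone: buffer nearly empty, grab the cheapest segment. */
    if (buffer_s < 5.0)
        return ladder_kbps[0];

    /* Otherwise choose the highest rendition the throughput can sustain,
     * with a 20% safety margin; a healthy buffer (>20 s) drops the margin. */
    double margin = (buffer_s > 20.0) ? 1.0 : 0.8;
    int choice = ladder_kbps[0];
    for (int i = 0; i < LADDER_LEN; i++)
        if (ladder_kbps[i] <= throughput_kbps * margin)
            choice = ladder_kbps[i];
    return choice;
}

int main(void)
{
    printf("buffer 25 s, 3000 kbps -> %d kbps\n", next_bitrate_kbps(25.0, 3000.0));
    printf("buffer 12 s, 1200 kbps -> %d kbps\n", next_bitrate_kbps(12.0, 1200.0));
    printf("buffer  3 s, 1200 kbps -> %d kbps\n", next_bitrate_kbps(3.0, 1200.0));
    return 0;
}
```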

Real-Time Systems and Gaming

In real-time operating systems (RTOS) such as FreeRTOS and TI-RTOS, buffer underruns occur when audio or data processing tasks fail to replenish buffers in time, leading to missed deadlines and system interruptions. For instance, in automotive audio systems using processors like the OMAP-L138, underruns in the multichannel audio serial port (McASP) manifest as transmission errors, where the transmit buffer empties before new data arrives, causing audible glitches or playback halts. These issues arise from insufficient task prioritization or delayed buffer servicing, as RTOS kernels do not inherently detect underruns without application-level monitoring. In automotive infotainment systems, such failures can disrupt warning alerts or audio playback, potentially compounding with deadline misses in hard real-time tasks.

In video gaming, buffer underruns primarily affect audio rendering pipelines, resulting in synchronization issues and perceptible hitches that degrade immersion. In engines like Unreal Engine, insufficiently fast audio callback processing causes the mixer buffer to starve, triggering underrun events logged as "Audio Buffer Underrun (starvation) detected." This leads to crackling, popping, or desynchronized sound effects relative to visuals, as seen in custom audio code where rendering lags behind the 48 kHz sample rate demands. For example, during intensive gameplay scenes with multiple sound sources, underruns manifest as brief audio dropouts, mimicking frame hitches and impacting player experience in titles relying on audio cues. Prevention involves optimizing buffer sizes and offloading work such as occlusion calculations asynchronously so that audio rendering completes within callback deadlines.

Recent advances in cloud gaming leverage machine learning for predictive buffering to preempt underruns amid variable network conditions. Platforms such as Microsoft's cloud gaming service employ AI-driven adaptive bitrate streaming and predictive maintenance to forecast network fluctuations, dynamically adjusting buffers to maintain smooth playback without underruns. As of 2025, platforms integrate machine learning models for real-time latency prediction, caching anticipated game assets proactively to reduce buffering delays by up to 30% on fluctuating connections. These techniques, including AI-optimized content delivery networks (CDNs), route data via low-latency paths, minimizing underrun risks in remote rendering pipelines.

In safety-critical applications, buffer underruns in RTOS environments can trigger fail-safe mechanisms by causing missed deadlines in time-sensitive data flows. In RTOS-based medical devices, underruns in sensor buffer processing may delay vital sign monitoring, triggering emergency shutdowns to prevent harm. Similarly, in drones, underruns during control loops can contribute to system instability, activating autonomous safety protocols. These implications underscore the need for robust scheduling to ensure deterministic behavior in cyber-physical systems.
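A common defensive pattern in game and embedded audio is to let the render callback pad missing samples with silence and count the event rather than block. The C sketch below models this generically; the mixer_read stub and all constants are hypothetical and not taken from any engine's API.

```c
/* Generic sketch of an audio render callback that tolerates starvation:
 * if the mixer has not produced enough samples by the deadline, the
 * callback outputs silence and counts the underrun instead of blocking.
 * The mixer stub and constants are hypothetical, not from any engine. */
#include <stdint.h>
#include <string.h>
#include <stdio.h>
#include <stdlib.h>

#define FRAMES_PER_CALLBACK 256
#define CHANNELS            2

static unsigned long underrun_count = 0;

/* Stand-in for an engine mixer: returns however many frames it managed
 * to produce (sometimes fewer than requested, simulating a slow mixer). */
static int mixer_read(int16_t *out, int frames)
{
    int produced = (rand() % 4 == 0) ? frames / 2 : frames;  /* 25% slow */
    memset(out, 0, (size_t)produced * CHANNELS * sizeof(int16_t));
    return produced;
}

/* Render callback: pads missing frames with silence rather than blocking,
 * so the DAC never replays stale data, and records the event. */
static void audio_callback(int16_t *out)
{
    int got = mixer_read(out, FRAMES_PER_CALLBACK);
    if (got < FRAMES_PER_CALLBACK) {
        memset(out + got * CHANNELS, 0,
               (size_t)(FRAMES_PER_CALLBACK - got) * CHANNELS * sizeof(int16_t));
        underrun_count++;
    }
}

int main(void)
{
    int16_t out[FRAMES_PER_CALLBACK * CHANNELS];
    for (int i = 0; i < 1000; i++)
        audio_callback(out);
    printf("underruns in 1000 callbacks: %lu\n", underrun_count);
    return 0;
}
```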
