
Frame rate

Frame rate is the frequency at which consecutive still images, known as frames, are captured or displayed in moving-image media such as film, video, and animation, typically measured in frames per second (fps). This metric determines the smoothness and perceived quality of motion, with higher rates reducing blur and enhancing realism by more closely approximating continuous movement. In motion picture production, 24 fps emerged as the global standard in the late 1920s, driven by the need to synchronize projected film with optical soundtracks while minimizing costs and ensuring the persistence of vision illusion. This rate, originally selected as a compromise between technical feasibility and economic efficiency during the transition from silent films (often 16–18 fps) to "talkies," remains the benchmark for cinematic storytelling due to its characteristic motion aesthetic. For broadcast television, regional standards vary: 25 fps for PAL systems in much of Europe and Asia, derived from 50 Hz power grids to avoid flicker, and 29.97 fps (non-drop or drop-frame variants) for NTSC in North America, adjusted from 30 fps to accommodate color encoding. These rates, formalized by organizations like the Society of Motion Picture and Television Engineers (SMPTE) and the International Telecommunication Union (ITU), ensure compatibility across legacy analog and modern digital workflows. Higher frame rates, such as 48, 50, 60, or even 120 fps, are increasingly adopted in sports broadcasting, gaming, and immersive media such as virtual reality to capture fast action with greater clarity and reduce judder. SMPTE standards, including extensions in ST 12-3 for timecode, support these elevated rates up to 120 fps for applications requiring enhanced temporal resolution, such as live sports and ultra-high-definition streaming. Frame rate selection influences not only visual fidelity but also storage, bandwidth, and processing demands, with mismatches potentially causing artifacts like judder or audio desynchronization during conversion and playback.

Fundamentals

Definition and Units

Frame rate is defined as the number of frames, or individual still images, displayed or captured per second in moving media such as film, video, or animation. This metric specifically quantifies the frequency at which consecutive frames are presented to create the illusion of motion, distinguishing it from shutter speed, which refers to the duration of exposure for each individual frame during capture, and refresh rate, which measures how frequently a display device updates its screen image, often independently of the source content's frame rate. The primary unit for frame rate is frames per second (fps), a standard adopted across modern film, television, and digital video production to ensure consistency in playback and synchronization. In early cinema, frame rates were occasionally notated in frames per minute (fpm) due to hand-cranked mechanisms that operated on a per-minute basis; for instance, the common 24 fps rate equates to 1440 fpm through simple multiplication by 60 seconds per minute. Notation for frame rates includes both integer values, such as 24 for cinematic standards or 30 for broadcast video, and fractional values like 23.976 fps, which is used for compatibility with NTSC television systems by slightly slowing 24 fps content to align with the 29.97 fps video rate derived from historical broadcast color encoding adjustments. Integer rates provide exact timing for simplicity, while fractional rates ensure seamless integration in legacy workflows without introducing audio pitch shifts or timing drifts. The frame rate can be calculated using the basic equation: \text{fps} = \frac{\text{total number of frames}}{\text{duration in seconds}} This formula allows for precise determination of the rate from recorded media duration and frame count, forming the foundation for timing standards in production and post-processing.
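The equation above translates directly into code; a minimal sketch in Python (function name is illustrative):

```python
def frames_per_second(total_frames: int, duration_seconds: float) -> float:
    """Compute frame rate as total frames divided by duration in seconds."""
    if duration_seconds <= 0:
        raise ValueError("duration must be positive")
    return total_frames / duration_seconds

# A 10-second clip containing 240 frames plays at 24 fps.
print(frames_per_second(240, 10.0))                 # 24.0
# A 60-second NTSC clip holding 1798 frames runs at ~29.97 fps.
print(round(frames_per_second(1798, 60.0), 2))      # 29.97
```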

Measurement Techniques

For analog film, frame rate measurement traditionally involves counting the perforations along the film strip (e.g., 4 per frame in 35mm film) over a known length and correlating it with the mechanical speed of the camera or projector, often calibrated using tachometers or stroboscopic devices to verify consistent cranking or motor rates. In hand-cranked systems, practical checks included filming a reference clock or rotating object to assess playback smoothness against expected fps equivalents like 16–18 fps (approximately 960–1080 fpm). In digital capture and playback environments, frame rate measurement relies on specialized tools and instruments designed to analyze signal timing, timestamps, and frame sequences accurately. Hardware instruments such as waveform monitors are commonly used to visualize luminance and chrominance levels alongside timing information in broadcast and live production settings, enabling verification of synchronization and rate consistency in SDI or HDMI signals. Oscilloscopes, particularly those with video-specific triggering capabilities, facilitate precise timing analysis of frame intervals by capturing electrical signal waveforms from video sources, helping detect deviations in frame delivery rates during troubleshooting. Software analyzers like MediaInfo provide a unified interface for extracting frame rate metadata from file containers, displaying details such as average and real-time frame rates without full decoding. Similarly, FFprobe, part of the FFmpeg suite, offers command-line-based probing of video streams to report frame rate metrics in both human-readable and parseable formats. Practical techniques for measuring frame rate include manual or automated frame counting over defined time intervals, where the total number of frames is divided by the duration to compute frames per second (fps). This method is effective for verifying constant frame rates (CFR) in offline analysis and can be implemented using software tools that enumerate frames from decoded streams.
High-speed camera capture is employed to identify discrepancies in recorded footage, such as irregular frame intervals caused by shutter variations or motion artifacts, by recording the output at rates exceeding the source video's fps for temporal comparison. Software-based extraction from file formats further simplifies measurement; for instance, in MP4 containers, tools parse box structures like 'tkhd' or 'mdhd' to retrieve duration and timescale data, yielding the nominal frame rate embedded during encoding. Exif metadata in certain camera-generated videos also stores frame rate tags, accessible via standard libraries for quick verification. Measuring frame rate presents challenges, particularly with variable frame rates (VFR) compared to CFR, as VFR videos exhibit fluctuating inter-frame intervals that complicate accurate averaging and can lead to playback inconsistencies without proper handling. In compressed streams, such as H.264-encoded MP4s, direct frame rate assessment often requires partial or full decoding, since frame reordering and bit allocation can affect timestamp reliability, increasing computational demands and potential errors in metadata-only probes. VFR footage from consumer devices exacerbates these issues, as frame dropping or duplication during recording introduces uncertainty in speed and timing calculations, necessitating advanced forensic techniques for accurate timing analysis. A specific example of measuring fps in a digital video file involves using FFprobe via the command line for a file named input.mp4. First, run ffprobe -v quiet -print_format json -show_streams input.mp4 to output stream metadata as JSON, including the video stream's r_frame_rate (real-time rate, e.g., "30/1" for 30 fps) and avg_frame_rate fields under the relevant stream object.
For CFR verification, compute the effective rate by extracting frame count with ffprobe -v quiet -count_frames -select_streams v:0 -show_entries stream=nb_read_frames -of csv=p=0 input.mp4 (yielding the total frames) and duration via ffprobe -v quiet -show_entries format=duration -of csv=p=0 input.mp4 (in seconds), then divide frames by duration. This process confirms the frame rate without re-encoding, though for VFR, avg_frame_rate provides an approximation while full stream analysis may reveal variations.
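FFprobe reports r_frame_rate and avg_frame_rate as rational strings (e.g., "30000/1001" rather than 29.97) to avoid rounding error. A small Python sketch, with illustrative function names, converts such strings and performs the frames-divided-by-duration check described above:

```python
from fractions import Fraction

def parse_rate(rational: str) -> float:
    """Convert an ffprobe rate string like '30000/1001' or '25/1' to fps."""
    num, _, den = rational.partition("/")
    return float(Fraction(int(num), int(den or 1)))

def effective_fps(nb_read_frames: int, duration_seconds: float) -> float:
    """CFR verification: frames counted with -count_frames over container duration."""
    return nb_read_frames / duration_seconds

print(parse_rate("30000/1001"))   # ~29.97 (NTSC)
print(parse_rate("24/1"))         # 24.0
print(effective_fps(720, 30.0))   # 24.0 for a 30 s clip of 720 frames
```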

Human Perception

Temporal Resolution Limits

The human visual system processes motion through mechanisms involving persistence of vision, where the retina retains an image for approximately 1/15 to 1/10 of a second (0.067 to 0.1 seconds) after stimulus offset, depending on brightness, and retinal refresh rates determined by photoreceptor response times. This persistence allows discrete images to blend into perceived continuous motion when presented rapidly enough, with the critical flicker fusion (CFF) threshold marking the frequency at which intermittent lights appear steady. For stationary lights, the average CFF threshold is around 50–60 Hz, though it can reach 90 Hz under high luminance and contrast conditions. In motion scenarios, the effective threshold is lower due to reduced sensitivity to temporal changes in moving stimuli. Temporal resolution, the eye's ability to distinguish successive visual events, varies widely among individuals and contexts, with studies indicating an average limit of 10–12 distinct images per second for basic event perception in standard conditions. Higher resolutions, up to 60 images per second or more, are achievable for some individuals, particularly experts like athletes or pilots, and with high-contrast or rapidly changing stimuli, where sensitivity can extend beyond 100 Hz. Early experiments, such as those by Jan Evangelista Purkinje in 1823, demonstrated these limits through observations of afterimages and the persistence of visual impressions from brief flashes, laying foundational insights into how the eye integrates temporal signals. Several factors influence temporal resolution, including age, which lowers the CFF in older adults due to slowed neural processing; lighting conditions, where higher luminance elevates the threshold by shortening integration times; and the distinction between foveal and peripheral vision, with the fovea supporting higher rates (up to 60 Hz) compared to the periphery (around 20–30 Hz) owing to denser photoreceptors.
A basic model approximates the CFF threshold as the reciprocal of the photoreceptors' integration time, reflecting the window over which visual signals are temporally summed before detection of flicker fails: \text{Threshold frequency} \approx \frac{1}{\text{integration time of photoreceptors}} This equation, derived from retinal physiology, typically yields thresholds aligning with observed values of 50–60 Hz for integration times around 16–20 ms under photopic conditions.
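The reciprocal model above is a one-line computation; a minimal sketch (integration-time values illustrative):

```python
def cff_threshold_hz(integration_time_s: float) -> float:
    """Approximate critical flicker fusion frequency as 1 / integration time."""
    return 1.0 / integration_time_s

# Photopic integration times of 16-20 ms bracket the observed 50-60 Hz range.
print(cff_threshold_hz(0.020))   # 50.0 Hz
print(cff_threshold_hz(0.016))   # 62.5 Hz
```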

Flicker and Motion Blur Effects

Flicker in video and film arises from the intermittent illumination during projection or display, leading to perceived brightness fluctuations when frame rates are low, such as below 24 frames per second (fps). At these rates, the human visual system detects the on-off cycles as strobing or judder, creating an unnatural, jerky motion appearance that disrupts smooth perception. This effect was exacerbated in early cinema projections, where single-blade shutters produced only 16–24 light pulses per second, making flicker highly noticeable. The 180-degree shutter rule addresses these issues by setting the shutter open time to half the frame interval, promoting natural motion portrayal through balanced exposure that introduces sufficient motion blur to mask strobing while avoiding excessive smearing. For a 24 fps production, this equates to a shutter speed of approximately 1/48 second, blending frames perceptually into continuous movement. Motion blur occurs because each frame captures motion over a finite exposure time, smearing fast-moving objects and contributing to realistic perception when appropriately tuned. The duration of this blur is determined by the exposure time per frame, given by the formula: \text{Blur duration} = \frac{\text{shutter angle fraction}}{\text{frame rate}} where the shutter angle fraction is the shutter angle in degrees divided by 360. For instance, at 24 fps with a 180-degree shutter (fraction = 0.5), the blur duration is 0.5 / 24 ≈ 1/48 second, which aligns with natural eye response for everyday motions. A classic example is the wagon-wheel effect, an artifact where rotating spokes appear stationary or reversing direction if their rotational frequency aliases with the frame rate, as demonstrated in perceptual models of sampled motion. Early mitigation techniques included double-bladed (or twin-blade) projector shutters, which doubled the light interruption to 48 cycles per second at 24 fps, rendering flicker imperceptible above the critical flicker fusion threshold of the eye.
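The blur-duration formula above can be computed directly; a short sketch using the 180-degree example from the text:

```python
def blur_duration(shutter_angle_deg: float, fps: float) -> float:
    """Exposure (blur) time per frame: (shutter angle / 360) / frame rate, in seconds."""
    return (shutter_angle_deg / 360.0) / fps

# 24 fps with a 180-degree shutter: 0.5 / 24 = 1/48 s (~0.0208 s).
print(blur_duration(180, 24))
# A narrower 90-degree shutter halves the blur (1/96 s), sharpening fast motion
# at the cost of more visible strobing.
print(blur_duration(90, 24))
```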
In film-to-video transfers, pulldown patterns like 3:2 pulldown distribute film frames across video fields to simulate even motion, reducing judder from mismatched rates such as 24 fps film on 30 fps (60 Hz) displays. Modern LED displays face similar challenges from pulse-width modulation (PWM) dimming, which can induce low-frequency flicker; higher refresh rates (e.g., 120 Hz or above) and elevated PWM frequencies mitigate this by exceeding perceptual thresholds. Research on high frame rates (HFR) indicates preferences for rates like 48 fps, which diminish motion blur and judder compared to 24 fps, as evidenced in viewer studies where HFR footage scored higher in motion quality for dynamic scenes, though some viewers report unnatural "hyper-realism" without adjusted shutter angles. These findings, drawn from controlled tests with expert and general audiences, highlight 48 fps as a threshold for reducing artifacts in action-heavy content, such as in HFR implementations for films like The Hobbit.

Historical Evolution

Silent Film Era

The origins of frame rate in cinema trace back to the late 19th century, with early experiments in motion picture technology emphasizing individual viewing devices over projection. Thomas Edison's Kinetograph camera, developed in the early 1890s, recorded images at approximately 40 frames per second (fps) on 35mm film, while the accompanying Kinetoscope peepshow viewer played back films at similar speeds, typically ranging from 40 to 46 fps to ensure smooth motion without excessive flicker. These rates were chosen to balance perceptual smoothness with the mechanical limitations of the continuous-loop film transport system, though actual speeds varied due to manual operation. A pivotal advancement came in 1896 with the Vitascope projector, which enabled public screenings and operated at variable speeds, typically 16 to 24 fps, facilitating the transition from peephole devices to large-audience projection. This device used perforated 35mm film pulled intermittently by sprockets, with projector speeds dictated by the operator's hand-cranking to maintain consistent pull-down and reduce film wear. Meanwhile, the Lumière brothers in France established a more economical standard with their Cinématographe in 1895, filming and projecting at 16 fps, which halved the film consumption compared to Edison's higher rates and made widespread production feasible. Throughout the silent era, frame rates remained variable, often between 16 and 18 fps for hand-cranked cameras and projectors, influenced by technical constraints such as the need for uniform perforation spacing on film stock to enable reliable sprocket engagement and transport. Projector mechanisms, reliant on manual cranking, introduced inconsistencies, while economic pressures—stemming from the high cost of raw celluloid film—encouraged lower rates to minimize material usage without compromising the illusion of motion. These factors resulted in non-standardized speeds across productions, with projection rates sometimes exceeding 20 fps for smoother effects in theatrical settings.
Standardization efforts intensified in the mid-1920s as the Society of Motion Picture Engineers (SMPE, later SMPTE) sought uniformity for theatrical projection. In 1927, following surveys of existing practices, the organization adopted 24 fps as the standard for 35mm film, reflecting an average of observed speeds and preparing the industry for synchronized sound, though many silent films continued to vary until full implementation. This rate balanced perceptual needs with mechanical reliability, marking the close of the silent era's experimental phase.

Sound and Color Film Transitions

The introduction of synchronized sound to motion pictures in the late 1920s marked a pivotal shift in frame rate standards, as the variable speeds common in the silent era—typically ranging from 16 to 22 frames per second—proved incompatible with the precise timing required for audio synchronization. Sound systems demanded a constant film speed to align image and audio tracks without distortion or drift, leading to the adoption of a fixed rate that balanced technical needs with production costs. The Vitaphone system, developed by Western Electric and implemented by Warner Bros. starting in 1926, established 24 frames per second (fps) as the new benchmark for sound films. This rate, equivalent to 90 feet of 35mm film per minute, was selected to ensure sufficient film speed for optical and disc-based sound recording while minimizing film stock usage compared to higher speeds. The landmark release of The Jazz Singer in 1927, the first feature-length film with extensive synchronized dialogue and music, utilized Vitaphone at 24 fps, accelerating the industry's transition from silent production. Concurrently, Fox's Movietone optical sound process, introduced in 1927, also aligned with 24 fps to compete with Vitaphone, though early implementations occasionally varied slightly before standardization. By the early 1930s, the Society of Motion Picture Engineers (SMPE), in coordination with the Academy of Motion Picture Arts and Sciences, formalized 24 fps as the universal standard for sound cinema through technical bulletins and recommendations, solidifying its role in theatrical exhibition. (Note: The 1932 efforts primarily refined related specifications like aperture dimensions, but frame rate consensus built on prior sound-era agreements.) The arrival of color processes in the 1930s and widescreen formats in the 1950s preserved this 24 fps foundation, ensuring compatibility across evolving technologies. Technicolor's three-strip system, debuting in the 1932 short Flowers and Trees, operated seamlessly at 24 fps, as the dye-transfer printing process did not alter the underlying frame rate mechanics.
Similarly, the 1953 introduction of CinemaScope by 20th Century Fox maintained 24 fps while employing anamorphic lenses to achieve a 2.55:1 aspect ratio, allowing theaters to project sound-era films without speed adjustments. Central to these transitions was the need for audio-frame synchronization, where 24 fps provided an even count of frames per second, facilitating precise alignment with audio waveforms recorded on the film's optical track. This rate ensured that audio cycles—requiring consistent linear film speed for frequencies up to several kilohertz—matched image progression without slippage, a necessity absent in silent film's flexible speeds. Later adaptations, such as transferring 24 fps film to non-matching video formats, introduced pulldown patterns like the 2:3 sequence, which repeats fields (two fields from one frame, three from the next) to convert to approximately 30 fps playback while preserving temporal integrity and pitch.
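The 2:3 field cadence described above expands each pair of film frames into five video fields (2 + 3), so four film frames fill ten fields, i.e. five interlaced video frames. A minimal illustration in Python:

```python
def pulldown_2_3(film_frames):
    """Expand 24 fps film frames into a 2:3 field cadence for ~30 fps video.
    Alternating source frames contribute two fields, then three."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeat = 2 if i % 2 == 0 else 3
        fields.extend([frame] * repeat)
    return fields

fields = pulldown_2_3(["A", "B", "C", "D"])
print(fields)       # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
print(len(fields))  # 10 fields -> 5 interlaced video frames per 4 film frames
```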

Broadcast Television Standards

Broadcast television standards for frame rates originated in the analog era, shaped by technical constraints and regional power grid frequencies. In the United States, the NTSC standard was approved by the Federal Communications Commission (FCC) in March 1941, establishing a frame rate of 30 frames per second for black-and-white transmissions, comprising two interlaced fields per frame at 60 fields per second. This rate aligned with the 60 Hz alternating current (AC) power supply prevalent in North America, minimizing visible flicker from electrical interference in early receivers. Experimental television work traces back to the 1920s and 1930s, but the 1941 approval marked its formal adoption for commercial broadcasting, effective from July 1, 1941. With the addition of color broadcasting in 1953, the frame rate was precisely adjusted to 29.97 frames per second (and 59.94 fields per second) to allocate spectrum for the color subcarrier without overlapping the audio carrier, a change necessitated by the limitations of the existing black-and-white infrastructure. In contrast, European standards developed later to address color transmission challenges. The Phase Alternating Line (PAL) system, introduced in West Germany on August 25, 1967, at the Radio Exhibition in West Berlin, standardized at 25 frames per second with 50 interlaced fields per second. This rate synchronized with the 50 Hz power grids common across much of Europe, reducing hum bars and flicker effects similar to NTSC's design. The Sequential Color with Memory (SECAM) standard, first broadcast in France in 1967, also adopted 25 frames per second and 50 fields per second, using 625 scan lines like PAL but with a different color encoding method to enhance transmission stability over long distances. These 50 Hz-derived rates provided a smoother integration with local electrical systems, though they resulted in slightly slower motion portrayal compared to NTSC's higher cadence. The shift to digital television in the late 1990s and early 2000s preserved these legacy frame rates for compatibility while introducing greater flexibility.
The Advanced Television Systems Committee (ATSC) standard, implemented in the United States starting in the mid-1990s and fully transitioned by 2009, retained 29.97 frames per second (59.94 fields per second) as the primary rate for standard-definition content, alongside support for fractional rates like 23.976 frames per second to accommodate 24 frames per second film transfers without temporal speedup or judder. Similarly, the DVB standards, widely adopted in Europe since 1995, maintained 25 frames per second (50 fields per second) to align with PAL and SECAM origins. These digital frameworks eliminated many analog artifacts but kept the regional divides rooted in power grid frequencies—60 Hz influencing North American rates and 50 Hz shaping European ones—to ensure seamless integration with existing equipment and content libraries. High-definition television (HDTV) upgrades further expanded options within these standards. ATSC for HDTV, rolled out progressively from 1998, supports progressive-scan formats such as 720p at up to 60 frames per second and 1080p at 24 and 30 frames per second, enabling cinematic film emulation at 24 fps, broadcast-style delivery at 30 fps, and smoother motion for sports or action at 60 fps, all while accommodating the traditional 29.97 fps for compatibility. DVB-T and DVB-S implementations similarly allow high-definition delivery at 25 and 50 frames per second, with optional 24 fps for international film content. These enhancements marked a departure from strict interlaced field rates, prioritizing progressive formats for improved vertical resolution, though global variations persist due to entrenched infrastructure.

Media Production Applications

Live-Action Film and Video

In live-action film and video production, the standard frame rate of 24 frames per second (fps) has long been established as the norm for achieving a cinematic aesthetic, characterized by a subtle motion blur that mimics the look of traditional film projection. This rate provides a balance between visual continuity and the artistic "dreamy" quality desired in narrative filmmaking, where higher rates might eliminate the intended texture. For television and video content, 30 fps is commonly used in North American broadcasting to deliver a more realistic and fluid motion suitable for broadcast standards, while 60 fps enhances smoothness for dynamic scenes like sports or action sequences without altering the overall perceptual character. Production decisions around frame rates often revolve around trading off motion smoothness against the preservation of a "filmic" look, where lower rates like 24 fps introduce intentional motion artifacts that contribute to emotional immersion, contrasting with the hyper-real clarity of higher rates that can make scenes feel overly sharp or video-like. High frame rate (HFR) experiments, such as the 48 fps used in Peter Jackson's The Hobbit: An Unexpected Journey (2012), aimed to reduce motion blur and increase detail for 3D viewing but sparked debate over whether the heightened realism detracted from the cinematic illusion. For slow-motion effects, footage is typically captured at 120 fps or higher to allow for conformed playback at 24 fps, extending the duration of action sequences while maintaining temporal detail and avoiding judder. In production workflows, on-set monitoring often occurs at the native capture rate to ensure accurate assessment of motion and exposure, with directors and cinematographers using tools like external recorders or camera viewfinders calibrated to these rates. During post-production, conforming involves aligning high-frame-rate clips to a master timeline, such as 24 fps, in editing software where the project settings dictate playback speed and interpolation to integrate slow-motion elements seamlessly without altering the overall narrative pace.
Specific implementations, like IMAX's adoption of 48 fps projection for select releases after 2016—including sequences in Avatar: The Way of Water (2022)—leverage dual-rate projection to enhance immersion in large-format screenings while offering fallback to 24 fps for broader compatibility.
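The slow-motion conform described above (capture at a high rate, play every frame back at 24 fps) reduces to simple arithmetic; a sketch with illustrative names:

```python
def conformed_duration(capture_fps: float, playback_fps: float,
                       real_seconds: float) -> float:
    """Timeline duration when footage shot at capture_fps is conformed
    (played back frame-for-frame) at playback_fps."""
    total_frames = capture_fps * real_seconds
    return total_frames / playback_fps

# One real second captured at 120 fps plays for 5 s at 24 fps (5x slow motion).
print(conformed_duration(120, 24, 1.0))   # 5.0
# Two real seconds at 240 fps stretch to 20 s on a 24 fps timeline.
print(conformed_duration(240, 24, 2.0))   # 20.0
```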

Animation Processes

In traditional animation, playback is standardized at 24 frames per second (fps), but the production process varies based on style and budget. Full animation, exemplified by Disney classics like Snow White and the Seven Dwarfs (1937), typically involves creating 24 unique drawings per second, known as animating "on ones," to achieve fluid, lifelike motion. In contrast, limited animation reduces the drawing count for efficiency; for instance, South Park employs an "on twos" technique, producing 12 unique drawings per second that are held for two frames each during 24 fps playback, resulting in an effective rate of 12 fps. This approach, common in television series, prioritizes expressive poses and dialogue over continuous motion while maintaining cinematic playback speed. In computer-generated (CG) animation pipelines, frame rates are configurable to suit project needs, with common animation software defaulting to 24 fps for film-oriented workflows. Animators set keyframes at this rate for final output, but real-time previews in the viewport often run at higher rates, such as 60 fps, to simulate smoother motion and aid iterative adjustments without full computation. Pixar Animation Studios has adhered to a 24 fps standard for feature films since Toy Story (1995), ensuring compatibility with theatrical projection, though internal simulations for elements like cloth or fluids may use higher sub-frame rates—up to several times the base rate—to capture complex dynamics accurately before downsampling to 24 fps. Stylistic choices in animation frequently manipulate effective frame rates to evoke specific moods or emphasize timing. Lower rates, such as 8–10 fps, produce jerky, staccato motion that heightens tension or comedic timing, as seen in select stop-motion sequences for dramatic impact. Conversely, smooth animations, particularly in CG productions, target 30 fps or above to convey realism and fluidity, aligning with viewer expectations for lifelike movement in contemporary productions.
These decisions balance artistic intent with technical constraints, drawing on established practices to enhance narrative expression.
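Animating "on twos" as described can be modeled by holding each drawing for two frames of a 24 fps sequence; a minimal sketch:

```python
def expose_on_twos(drawings, hold=2):
    """Hold each unique drawing for `hold` frames (on twos = 2), turning
    12 drawings per second into a full 24 fps frame sequence."""
    return [d for d in drawings for _ in range(hold)]

second_of_drawings = [f"drawing_{i}" for i in range(12)]  # 12 unique drawings
frames = expose_on_twos(second_of_drawings)
print(len(frames))  # 24 frames -> one second of 24 fps playback, 12 fps effective
```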

Digital Camera Specifications

Digital cameras' frame rates are primarily constrained by sensor architecture, with rolling shutter and global shutter designs imposing distinct limitations on maximum achievable frames per second (fps). Rolling shutter sensors, prevalent in most consumer and mid-range professional models, expose and read out pixels line by line from top to bottom, leading to potential distortion (known as the jello effect) in fast-moving scenes and capping fps due to sequential readout times that can exceed the exposure interval at high speeds. Global shutter sensors, by contrast, expose and read the entire frame simultaneously, eliminating readout delays and distortion, thereby supporting significantly higher fps without compromising image integrity; for example, the Sony α9 III utilizes a full-frame global shutter to deliver up to 120 fps bursts with full autofocus and autoexposure tracking. Resolution trade-offs further dictate frame rate specifications, as higher pixel counts increase data volume and processing demands, often forcing reductions in fps to maintain manageable bandwidth and heat levels. In 2020s cameras, 4K (ultra-high definition) recording at 60 fps has become standard for smooth motion in professional workflows, while 8K resolutions typically limit recording to 30 fps due to the quadrupling of pixels straining sensor readout and encoding capabilities, as evidenced in the Nikon Z9, which achieves 8K/60p only through optimized internal processing but defaults to lower rates for extended recording. Professional cinema cameras like the ARRI Alexa series exemplify tailored specifications for high-frame-rate needs, with the Alexa 35 supporting up to 120 fps in Super 35 format at full 4.6K resolution, often windowed to 2K for even higher rates in slow-motion applications. Action-oriented devices such as GoPro's HERO13 Black prioritize slow-motion versatility, offering 240 fps at 1080p (full HD) for capturing dynamic sequences like sports or stunts.
Consumer smartphones, including the iPhone 16 Pro (as of 2024), reflect advancing capabilities with 4K video up to 120 fps, balancing portability and battery life against computational overhead. Variable frame rate (VFR) support enhances flexibility in digital cameras by permitting non-constant fps within a single recording, which is encoded via standards like H.264 (AVC) and H.265 (HEVC) to accommodate variable bit rates and timestamps. This capability is particularly useful for seamless transitions between normal and slow-motion playback, as implemented in professional camcorders like the Panasonic AG-CX350, which allows VFR from 1 fps to 60 fps in full HD mode. By 2025, sensor advancements have elevated high-frame-rate options for virtual reality (VR) and immersive applications, with stacked designs enabling rates like 360 fps at 900p or 400 fps at 720p (HD-equivalent) in cameras such as the GoPro HERO13 Black, facilitating smoother 360-degree capture and reduced motion artifacts in VR content.

Digital and Interactive Media

Computer Graphics Rendering

In computer graphics rendering, particularly for non-interactive applications such as visual effects (VFX) and simulations, frame rate dictates the temporal resolution of the output sequence, influencing both production workflows and final playback quality. Rendering pipelines typically involve sequential frame computation on CPU or GPU hardware, where each frame requires processing geometry, textures, lighting, and shading. The time to render a single frame depends on scene complexity, hardware capabilities, and algorithm efficiency, often measured in seconds or minutes per frame for high-fidelity offline rendering. For film VFX, a target frame rate of 24 frames per second (fps) is standard to align with cinematic motion standards, as seen in tools like Houdini where default animation settings are configured at 24 fps to facilitate seamless integration into 24 fps projects. Optimization techniques are crucial in offline rendering to adhere to production deadlines, especially when rendering hundreds or thousands of frames. Adaptive sampling methods dynamically adjust the number of samples per pixel based on local variance, allocating computational resources more efficiently to noisy regions while undersampling uniform areas, thereby reducing overall render time without compromising quality. This approach is particularly valuable in Monte Carlo-based path tracing, common in VFX pipelines, where it helps meet tight schedules by balancing image quality with temporal constraints across frames. A key aspect of render time estimation involves calculating total render duration, which can be approximated as the product of the frame count and per-frame computation time; for geometry-intensive scenes, per-frame time scales with polygon count divided by GPU throughput (e.g., triangles processed per second). Formally, \text{Total time} = \text{frames} \times \left( \frac{\text{polygons}}{\text{GPU throughput}} \right) This model highlights how hardware upgrades or geometry simplification can accelerate workflows, though actual times vary with additional factors like shader complexity. Beyond VFX, frame rate standards in real-time rendering extend to interactive outputs.
Web animations typically target 60 fps for smooth performance on varied devices, while user interface (UI) animations in application software aim for 60 fps to ensure fluid responsiveness and reduce perceived latency. In tools like Blender, the viewport preview operates at up to 60 fps using simplified rendering (e.g., the Eevee engine) for interactive editing, contrasting with the final offline output at 24 fps via Cycles for photorealistic sequences, allowing artists to iterate efficiently without waiting for full renders.
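The total-time model above can be evaluated numerically; a sketch where the throughput and polygon figures are hypothetical illustrations, not benchmarks:

```python
def total_render_time_s(frames: int, polygons_per_frame: float,
                        gpu_throughput_tris_per_s: float) -> float:
    """Estimate offline render time as frames x (polygons / GPU throughput)."""
    return frames * (polygons_per_frame / gpu_throughput_tris_per_s)

# A 10-second shot at 24 fps (240 frames) with 50M triangles per frame on
# hardware processing 100M triangles per second (hypothetical figures):
print(total_render_time_s(240, 50e6, 100e6))   # 120.0 seconds
```

Real render times also depend on shading, sampling, and memory behavior, so this linear model is only a first-order planning estimate.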

Video Games and Real-Time Simulation

In video games and real-time simulations, frame rate targets are established to ensure smooth interactivity and responsiveness, with 60 frames per second (fps) serving as the standard for console gaming, including on the PlayStation 5 (PS5). This benchmark balances visual fidelity and performance, allowing developers to prioritize stable gameplay in performance modes without excessive hardware demands. On personal computers (PCs), higher targets of 120 fps or more are common for users with high-refresh-rate monitors (120–144 Hz or higher), enabling fluid motion in competitive titles and reducing input lag for enhanced player control. Vertical synchronization (V-Sync) is frequently employed to align the game's frame rate with the display's refresh rate, preventing screen tearing by buffering frames, though it may introduce minor latency if the target exceeds the monitor's capabilities. Maintaining consistent frame rates presents significant challenges, particularly frame time variance, which measures inconsistencies in the duration required to render each frame and often results in perceptible stutter even when the average fps remains stable. For instance, spikes in frame time can disrupt smooth motion, making gameplay feel choppy despite an overall 60 fps average. Advanced rendering techniques like ray tracing exacerbate these issues by increasing computational load; at launch in December 2020, Cyberpunk 2077's ray-traced modes on consoles such as the PS5 and Xbox Series X targeted 30 fps in quality modes to accommodate the intensive lighting and reflection calculations, leading to noticeable performance trade-offs compared to non-ray-traced performance modes at 60 fps. In virtual reality (VR) and augmented reality (AR) simulations, frame rate requirements are more stringent to mitigate motion sickness, with a minimum of 90–120 fps recommended to minimize sensory conflicts between visual cues and vestibular feedback.
Studies indicate that 120 fps represents a critical threshold, significantly reducing simulator sickness symptoms like nausea compared to 60 or 90 fps, as higher rates better replicate natural head movement perception. To achieve these targets on varied hardware, adaptive quality scaling dynamically adjusts rendering resolution or graphical detail in real time, scaling down during demanding scenes to maintain frame rates and scaling up when resources allow, thereby optimizing immersion without compromising user comfort. A pivotal advancement in addressing frame rate limitations is NVIDIA's Deep Learning Super Sampling (DLSS), introduced in 2018, which leverages AI-driven upscaling to boost fps while preserving image quality. By rendering at lower internal resolutions and using tensor-core-accelerated neural networks to reconstruct higher-resolution frames, DLSS enables ray-traced games to reach 60–120 fps on mid-range hardware, reducing the performance penalty of real-time ray-tracing effects without perceptible quality loss.
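Adaptive quality scaling of the kind described above can be sketched as a simple feedback loop. This is an illustrative toy, not any engine's actual API; the function name, step size, and 90 fps budget are assumptions for the example:

```python
def adjust_resolution_scale(scale, frame_time_ms, target_ms=1000.0 / 90,
                            step=0.05, lo=0.5, hi=1.0):
    """Lower the render-resolution scale when a frame overruns its budget,
    raise it back when there is headroom (toy dynamic-resolution controller)."""
    if frame_time_ms > target_ms:           # missed the 90 fps budget: scale down
        scale -= step
    elif frame_time_ms < 0.8 * target_ms:   # comfortable headroom: scale up
        scale += step
    return min(hi, max(lo, scale))

scale = 1.0
scale = adjust_resolution_scale(scale, 14.0)  # over the ~11.1 ms budget: drops
scale = adjust_resolution_scale(scale, 7.0)   # headroom: climbs back (clamped at 1.0)
```

Real systems (including VR compositors) use smoothed frame-time histories rather than single samples, but the control structure is the same: trade resolution for a held frame rate.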

Technical Enhancements

Frame Rate Conversion Overview

Frame rate conversion becomes necessary when content captured or produced at one frame rate must be adapted for distribution or playback on systems operating at a different rate, such as transferring 24 frames per second (fps) film footage to 29.97 fps broadcast video. This mismatch arises from historical standards: film aimed for cinematic motion at 24 fps, while television required higher rates to accommodate interlaced scanning and color subcarrier signals. In modern contexts, real-time up-conversion is common in smart TVs to match incoming signals, like 24 fps film content, to the display's native refresh rate, such as 60 Hz, ensuring compatibility without playback interruptions. Basic methods for frame rate conversion include frame duplication for up-conversion and frame decimation for down-conversion. In up-conversion, such as from 24 fps to 29.97 fps for NTSC broadcast, techniques like 3:2 pulldown repeat frames in an alternating cadence—one source frame becomes two interlaced fields and the next becomes three—to fill the target rate without altering playback speed. For down-conversion, decimation involves selectively dropping frames to reduce the rate, preserving overall timing by removing redundant or interpolated frames while minimizing motion discontinuity. Poor frame rate conversion can introduce artifacts like judder, a stuttering effect from uneven frame repetition or mismatched timing, particularly noticeable in panning shots. Conversely, matching the native frame rate during playback eliminates such issues, delivering smoother motion and preserving the intended aesthetic, as seen when avoiding pulldown on progressive displays. In practical applications, Blu-ray players often handle 24p output directly for film content, outputting at 24 fps when connected to compatible displays to bypass conversion artifacts. Streaming services like Netflix enforce standards such as 23.976 fps for cinematic deliveries, ensuring consistent frame rates across devices and reducing conversion needs during encoding and playback.
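The 3:2 pulldown cadence can be expressed as a short mapping from film frames to interlaced fields; this sketch models the field pattern only, ignoring the top/bottom field split of a real interlaced signal:

```python
def three_two_pulldown(frames):
    """Map 24 fps film frames to interlaced fields using the 2:3 cadence:
    each frame alternately contributes two fields, then three.
    4 film frames -> 10 fields -> 5 video frames (24 fps -> ~30 fps)."""
    fields = []
    for i, frame in enumerate(frames):
        repeats = 2 if i % 2 == 0 else 3
        fields.extend([frame] * repeats)
    return fields

# Four film frames yield ten fields, i.e. five interlaced video frames.
cadence = three_two_pulldown(["A", "B", "C", "D"])
```

The uneven repetition visible in the output (some frames held for two fields, others for three) is exactly the source of the judder described above.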

Interpolation and Motion Compensation Methods

Frame interpolation techniques form the core of many frame rate up-conversion algorithms, aiming to synthesize intermediate frames between existing ones to increase temporal resolution. Simple linear blending, the most basic method, generates new frames by averaging the pixels of adjacent input frames, such as creating a midpoint frame as I_t = \frac{I_{t-1} + I_{t+1}}{2}, where I_t is the interpolated frame at time t. This approach is computationally efficient but often results in motion blurring and artifacts in scenes with significant movement, as it assumes no pixel displacement between frames. In contrast, optical flow estimation provides a more sophisticated approach by computing dense motion fields that describe pixel trajectories across frames, enabling accurate warping and synthesis of intermediate content. Flow-based methods, such as Real-Time Intermediate Flow Estimation (RIFE), estimate bidirectional flows between input frames and use them to generate non-linearly interpolated frames, supporting arbitrary timesteps without relying on pre-trained optical flow models. Under a constant-velocity assumption, the per-unit-time displacement can be approximated as \vec{d} = \frac{\vec{p}_{\text{next}} - \vec{p}_{\text{current}}}{\Delta t}, where \vec{p} represents pixel positions and \Delta t is the time interval, allowing pixels to be back-projected to their origins for synthesis. This flow-based approach significantly reduces blurring compared to linear blending, particularly in dynamic sequences. Motion compensation enhances interpolation by explicitly accounting for object displacements through block-matching algorithms, which divide frames into blocks and search for best-matching blocks in reference frames to derive motion vectors. In frame rate up-conversion, block matching, as detailed in early motion-compensated methods, minimizes a cost function such as the sum of absolute differences (SAD) to find the vectors, then compensates by shifting blocks to form interpolated frames, improving accuracy over global averaging.
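Both the linear blend and the SAD cost used in block matching are simple enough to sketch in a few lines of Python; flat lists stand in for image blocks here, whereas real implementations operate on 2-D pixel arrays:

```python
def blend_midpoint(prev_frame, next_frame):
    """Linear blending: I_t = (I_{t-1} + I_{t+1}) / 2, pixel by pixel."""
    return [(a + b) / 2 for a, b in zip(prev_frame, next_frame)]

def sad(block_a, block_b):
    """Sum of absolute differences: the matching cost minimized when
    searching for a block's motion vector in a reference frame."""
    return sum(abs(a - b) for a, b in zip(block_a, block_b))

mid = blend_midpoint([10, 20, 30], [30, 40, 50])  # midpoint frame
cost = sad([10, 20, 30], [12, 18, 33])            # |−2| + |2| + |−3| = 7
```

A block-matching search simply evaluates `sad` for every candidate offset within a search window and keeps the offset with the lowest cost as the motion vector.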
These algorithms are foundational in video processing standards and are often refined with multi-resolution searches to handle varying motion scales. Hybrid methods integrate optical flow estimation with frame synthesis and block-based compensation to address limitations of individual techniques, such as occlusions or complex deformations. For instance, frameworks described in comprehensive video frame interpolation surveys combine dense flow fields for fine-grained motion with synthesis modules that blend warped pixels, using techniques like adaptive weighting to resolve conflicts in overlapping regions. This hybrid design, seen in bidirectional flow networks, yields smoother interpolations by leveraging motion estimates for guidance and direct generation for refinement. Practical implementations of these methods appear in consumer displays, such as the Trimension Digital Natural Motion (DNM) technology in Philips televisions, which employs flow-based motion estimation to interpolate frames and reduce judder in 24 fps content displayed at 60 fps. More recently, AI-driven tools like Topaz Video AI (as of 2025) utilize neural networks for up-conversion, estimating optical flows and synthesizing frames to achieve 60 fps or higher from lower-rate sources, with capabilities for up to 2000% slow-motion effects while preserving detail. Quality assessment in these methods commonly relies on metrics like peak signal-to-noise ratio (PSNR), which quantifies reconstruction fidelity by comparing interpolated frames to ground-truth high-rate sequences, with higher values (e.g., >30 dB) indicating better preservation of intensity and structure. However, limitations persist, particularly ghosting artifacts in complex scenes involving occlusions or rapid changes, where mismatched motion vectors cause duplicated or faded edges, as observed in perception-oriented studies. These issues highlight the need for more advanced handling of non-linear motion in interpolation approaches.
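PSNR as used for interpolation quality assessment follows directly from the mean squared error between an interpolated frame and its ground-truth counterpart; a minimal sketch for 8-bit pixel values (flat lists stand in for frames):

```python
import math

def psnr(reference, interpolated, max_value=255.0):
    """Peak signal-to-noise ratio in dB between two equal-length pixel
    sequences; higher values mean the interpolated frame is closer to
    the ground truth. Identical frames give infinite PSNR (MSE = 0)."""
    mse = sum((r - i) ** 2 for r, i in zip(reference, interpolated)) / len(reference)
    if mse == 0:
        return float("inf")
    return 10.0 * math.log10(max_value ** 2 / mse)

# A frame off by one 8-bit level everywhere has MSE = 1, i.e. ~48 dB.
value = psnr([100, 150, 200], [101, 151, 201])
```

Because PSNR is a pointwise intensity measure, it can rate a blurred or ghosted frame highly despite visible structural damage, which is why structural metrics are often reported alongside it.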

High Frame Rate Formats

High frame rate (HFR) formats refer to capture and display technologies operating at 48 frames per second (fps) or higher, primarily to minimize motion blur and enhance clarity in dynamic scenes compared to standard 24 or 30 fps rates. This approach captures more images per second, allowing for smoother perceived motion, particularly beneficial in fast-paced content where traditional rates can introduce artifacts like strobing or judder. A landmark in HFR adoption was the 2012 theatrical release of The Hobbit: An Unexpected Journey, directed by Peter Jackson and filmed at 48 fps in 3D, which ignited widespread debate over its hyper-realistic "soap opera" aesthetic versus traditional cinematic warmth. Despite mixed reception, the format demonstrated potential for immersive viewing in action-heavy narratives. In sports broadcasting, HFR has been trialed at 120 fps to capture rapid movements, such as at Olympic events, to improve live coverage and slow-motion replays of athletics and aquatics. Specialized HFR formats include Dolby Vision, which supports playback at up to 120 fps for enhanced HDR content in film and video, enabling vibrant colors and fluid motion on compatible devices. In virtual reality (VR), standards for headsets like the Meta Quest target refresh rates of 90–120 Hz (as of 2025) to prevent motion sickness and ensure responsive immersion during head-tracked experiences. By 2025, consumer televisions have integrated HFR at 120 fps, as seen in Samsung's Neo QLED models, which leverage mini-LED backlighting for sharp, high-motion visuals in home entertainment. These formats offer advantages like superior clarity in action sequences, reducing visual strain for viewers of sports or simulations, but face adoption hurdles due to increased data demands. For instance, transmitting 4K or 8K HFR requires substantial bandwidth, often necessitating HDMI 2.1 interfaces with up to 48 Gbps throughput to avoid artifacts or playback issues.
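The bandwidth pressure behind those adoption hurdles is easy to estimate from first principles. The sketch below counts only the raw pixel payload, ignoring blanking intervals and link encoding, both of which raise the real figure, yet 8K HFR already exceeds HDMI 2.1's 48 Gbps, which is why such links rely on Display Stream Compression:

```python
def uncompressed_gbps(width, height, fps, bits_per_pixel=30):
    """Raw pixel payload in Gbit/s (10-bit RGB = 30 bits per pixel).
    Real links need more: blanking and line encoding add overhead."""
    return width * height * fps * bits_per_pixel / 1e9

uhd_120 = uncompressed_gbps(3840, 2160, 120)   # 4K at 120 fps: ~29.9 Gbps
eightk_60 = uncompressed_gbps(7680, 4320, 60)  # 8K at 60 fps: ~59.7 Gbps, over 48
```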
