
Depth of focus

Depth of focus refers to the range of distances along the optical axis in image space within which the image of an object remains acceptably sharp, representing the tolerance for displacing the image plane (such as a sensor or film) from its nominal position before the resulting defocus blur exceeds an acceptable limit, typically defined by the circle of confusion. This concept is fundamental in optical engineering, particularly in systems where precise alignment of the detector is critical, such as in photography, microscopy, and machine-vision imaging devices. Unlike depth of field, which describes the corresponding range of object distances in object space that appear acceptably sharp when imaged onto a fixed image plane, depth of focus operates in the conjugate image space and is scaled by the square of the system's magnification. For instance, in a high-magnification system, the depth of focus is larger relative to the depth of field, while in low-magnification setups such as landscape photography, the image-space tolerance is smaller despite a deeper object-space depth. This distinction arises from the longitudinal magnification of optical conjugates, where defocus in object space is magnified in image space, affecting the allowable positioning error.

The magnitude of depth of focus is influenced by several key parameters, including the wavelength of light (λ), the f-number (F/#) of the optical system, and the acceptable blur size. A common approximation for the diffraction-based depth of focus is Δz ≈ 2λ(F/#)², indicating that it increases quadratically with f-number (smaller apertures) and linearly with wavelength, allowing greater positional tolerance in low-resolution or narrow-aperture systems. In microscopy, where numerical aperture (NA) plays a prominent role, depth of focus also varies with magnification; for example, a 4× objective with NA 0.10 may yield an image-side depth of 0.13 mm, while a 100× objective with NA 0.95 can extend to 80 mm, facilitating precise sensor alignment despite a shallow depth of field.

Applications of depth of focus are widespread in precision optics. In camera design and manufacturing, it determines the mechanical tolerance for lens-sensor alignment, ensuring consistent image quality across production variations. In ophthalmology, particularly with intraocular lenses, it quantifies the eye's tolerance to defocus, influencing designs for extended depth of focus implants that enhance vision over a range of distances without additional correction. Overall, understanding depth of focus enables optimization of optical systems for image quality, alignment tolerance, and manufacturing feasibility.
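
As a quick worked evaluation of this approximation, assuming green light at λ = 0.55 μm (an illustrative value rather than one taken from the sources):

    \Delta z \approx 2 \lambda (F/\#)^2 = 2 \times 0.55\ \mu\text{m} \times 4^2 \approx 17.6\ \mu\text{m at } f/4, \qquad 2 \times 0.55\ \mu\text{m} \times 8^2 \approx 70.4\ \mu\text{m at } f/8,

so stopping down by two stops quadruples the focus tolerance, consistent with the quadratic dependence on f-number.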

Definitions and Concepts

Basic Definition

Depth of focus refers to the axial range in image space over which an image remains acceptably sharp when the sensor or film plane is displaced from the precise focal plane. This tolerance allows for minor variations in the positioning of the image detector without significantly degrading image quality, as the blur caused by such displacement stays within an acceptable limit defined by the circle of confusion. Unlike concepts measured in object space, depth of focus specifically quantifies the permissible shift in the lens-to-sensor distance, which is generally small—on the order of millimeters or less—and contrasts with the much larger distances involved in object positioning. This image-space metric is crucial for practical systems where exact alignment of components may be challenging, providing a buffer against mechanical imperfections or vibrations.

Visually, depth of focus can be understood through the geometry of converging rays: a lens focuses rays from an object point into a converging cone that meets at the focal point before diverging; the depth of focus represents the longitudinal segment along the optical axis where the cross-section of this cone remains smaller than the allowable blur diameter, often depicted as a diamond-shaped tolerance zone bounded by rays running from the edges of the aperture to the edges of the circle of confusion. This concept in image space corresponds to depth of field in object space, where the latter describes the range of object distances yielding sharp images for a fixed sensor position.
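
To put numbers on this ray-cone picture (with illustrative values assumed here, not drawn from the cited sources), consider a lens with aperture diameter D = 25 mm forming an image at v = 100 mm (an f/4 cone) and an allowable circle of confusion c = 0.03 mm:

    b = \delta z \cdot \frac{D}{v} = \frac{\delta z}{N}, \qquad b = c \ \Rightarrow\ \delta z = N c = 4 \times 0.03\ \text{mm} = 0.12\ \text{mm},

so the diamond-shaped tolerance zone extends ±0.12 mm about the nominal image plane, giving a total depth of focus of 2Nc = 0.24 mm.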

Comparison with Depth of Field

Depth of field (DOF) is defined as the range of distances in object space—typically in front of the lens—over which objects appear acceptably sharp when projected onto a fixed image plane, such as a camera sensor. In contrast, depth of focus (DOFoc) describes the range of positions for the image plane itself over which a stationary object in the scene maintains acceptable sharpness, allowing for variations in sensor placement or orientation. This distinction positions DOF as a property of the subject-to-lens relationship and DOFoc as a characteristic of the lens-to-image tolerance. The two concepts share a reciprocal relationship: a shallow DOF in object space, which limits the sharpness range for subjects at varying distances, corresponds to a deeper DOFoc in image space, providing greater leeway for adjustments, and conversely for deeper DOF scenarios; this interplay arises from the longitudinal magnification inherent in the optical system, which scales distances between object and image spaces. Beginners often conflate DOF and DOFoc due to their similar names, mistakenly believing that DOFoc influences the direct sharpness of scene elements like backgrounds or subjects, when it instead governs the precision required in aligning the components behind the lens.

Qualitative examples illustrate these differences clearly. In portrait photography, a shallow DOF enables the subject's face to remain sharply rendered while the background blurs into a soft, non-distracting bokeh, emphasizing the foreground element within object space. By comparison, DOFoc comes into play during camera manufacturing or setup, where it determines the allowable misalignment of the sensor—such as slight tilts or shifts—without compromising overall image clarity for a fixed subject.

Influencing Factors

Optical Parameters

The size of the aperture, quantified by its f-number, is a key determinant of depth of focus. A smaller f-number, which corresponds to a larger aperture diameter, reduces the depth of focus by creating a steeper cone of light rays converging on the image plane; this limits the allowable displacement of the sensor before the resulting defocus exceeds the tolerance. Conversely, increasing the f-number (stopping down the aperture) widens the depth of focus, as the narrower ray bundle permits greater positional tolerance without significant degradation in sharpness.

Wavelength of light plays a critical role in defining depth of focus through diffraction effects. Shorter wavelengths, such as those in blue light (around 470 nm), yield a smaller depth of focus because they result in tighter diffraction-limited spot sizes, imposing stricter limits on blur circle growth from defocus. In contrast, longer wavelengths allow for a more extended depth of focus by relaxing these diffraction constraints.

Lens aberrations further modulate depth of focus by introducing deviations from ideal ray convergence. Chromatic aberration, which varies with wavelength, causes different colors to focus at slightly offset planes, asymmetrically narrowing the effective depth of focus and potentially shifting the best-focus position. Spherical aberration, meanwhile, affects marginal rays more severely, leading to a curved focal surface that reduces symmetry and tolerance around the nominal focus; simple single-element lenses exhibit pronounced spherical aberration, resulting in a more restricted depth of focus compared to compound lenses designed to minimize such errors through multiple elements.
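
The aperture dependence can be made concrete with a small sketch based on the geometric blur relation b = δz / N derived later under Core Equations; the f-numbers and the 0.1 mm displacement below are illustrative assumptions, not values from the cited sources.

    def geometric_blur_diameter_mm(delta_z_mm, f_number):
        """Blur-circle diameter b = delta_z / N for an image-plane displacement delta_z (mm)."""
        return delta_z_mm / f_number

    # The same 0.1 mm sensor displacement produces a 4x larger blur circle at f/2 than at f/8,
    # so the wide-aperture system tolerates far less positioning error.
    for n in (2.0, 8.0):
        blur_um = geometric_blur_diameter_mm(0.1, n) * 1000.0
        print(f"f/{n:g}: blur diameter = {blur_um:.1f} um for a 0.1 mm displacement")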

System and Environmental Factors

In imaging systems, the resolution of the sensor or detector plays a critical role in determining the effective depth of focus by influencing the acceptable size of the circle of confusion, which defines the threshold for perceptible blur. Higher-resolution sensors, featuring small pixel sizes (typically around 8 μm in high-end digital single-lens reflex cameras), necessitate a correspondingly smaller circle of confusion—often on the order of the pixel pitch—to preserve sharpness across the frame, thereby reducing the tolerance for axial shifts and compressing the overall depth of focus.

System magnification further modulates depth of focus, with higher magnification levels—common in telephoto or macro configurations—extending the axial range over which the image remains acceptably sharp due to the quadratic relationship between depth of focus and magnification. This arises from the longitudinal magnification of the optical system, where increased magnification amplifies the object-space depth into a larger image-space tolerance, with depth of focus scaled by the square of the magnification factor. In close-focus and macro evaluations, this scaling accounts for the extended tolerance in image space.

Temperature variations and mechanical stability introduce additional constraints on usable depth of focus by inducing shifts in the focal plane through thermal expansion and vibrations. Thermal expansion in lens mounts and housings can displace the focal plane by up to ±80 μm over temperature ranges from -40 °C to +85 °C, effectively narrowing the operational depth of focus unless compensated by athermalization techniques such as passive optical compensation or mechanical adjustments. Similarly, environmental vibrations, prevalent in laboratory and industrial settings, can cause axial drifts exceeding 5 μm over hours, pushing the image beyond the depth of focus and degrading sharpness; active stabilization systems, achieving roughly 21 nm axial precision, are often required to maintain the light sheet or focal plane within this tolerance during extended acquisitions.

Illumination conditions indirectly affect depth of focus assessment by altering the perception of sharpness through contrast reduction and glare. Non-uniform lighting or stray light introduces veiling glare, which spreads across the image and exacerbates the visibility of defocus-induced blur, making marginally out-of-focus regions appear softer or hazier even within the nominal depth of focus. This effect is quantified in contrast and spatial-frequency response metrics, where veiling glare reduces modulation transfer and perceived sharpness, particularly in systems with shallow depth of focus.
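
As a minimal sketch, the figures above (an ~8 μm pixel-pitch circle of confusion and a ±80 μm thermal shift) can be combined with an assumed f/4 working aperture to check whether thermal drift alone exhausts the focus budget; the function and variable names are illustrative.

    def depth_of_focus_um(f_number, coc_um, magnification=0.0):
        """Total depth of focus DOFoc = 2 * N * c * (1 + m), in micrometres."""
        return 2.0 * f_number * coc_um * (1.0 + magnification)

    f_number = 4.0           # assumed working aperture (illustrative)
    coc_um = 8.0             # circle of confusion taken as one ~8 um pixel pitch, as above
    thermal_shift_um = 80.0  # worst-case focal-plane displacement over -40 C to +85 C

    budget_um = depth_of_focus_um(f_number, coc_um)   # 64 um total, i.e. +/-32 um
    print(f"DOFoc budget: +/-{budget_um / 2:.0f} um; thermal shift: +/-{thermal_shift_um:.0f} um")
    print("athermalization needed" if thermal_shift_um > budget_um / 2 else "within tolerance")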

Mathematical Formulation

Core Equations

The depth of focus, denoted as DOFoc, quantifies the axial range in image space over which the sensor can be positioned while maintaining acceptable sharpness, typically defined by the blur circle not exceeding the circle of confusion diameter c. In the geometric approximation for distant objects (infinite conjugates), the primary expression is \text{DOFoc} \approx 2 N c, where N is the f-number of the lens (also known as the relative aperture, N = f / D with f the focal length and D the aperture diameter) and c represents the maximum allowable blur diameter in the image plane, often set to the pixel size or a fraction thereof based on resolution requirements. This expression arises from paraxial ray tracing considerations of defocus blur formation.

Consider an ideal thin lens focusing parallel rays from a distant point source onto the image plane at distance v \approx f. A marginal ray parallel to the optical axis passes through the edge of the aperture at height D/2 and converges to the axial focal point. If the image plane is displaced axially by \delta z toward the lens, the intersection of the marginal rays with the displaced plane forms a blur circle of diameter b = \delta z \cdot (D / v). Since N \approx v / D for infinite conjugates, b = \delta z / N. Setting the acceptable blur b = c yields the one-sided tolerance \delta z = c N; accounting for symmetric defocus on either side of the nominal focus gives the total depth of focus \text{DOFoc} = 2 c N. This derivation assumes paraxial rays (small angles relative to the optical axis) and neglects higher-order aberrations.

For finite conjugate systems involving magnification m (where m = v / u, with u the object distance), the formula extends to account for the increased image distance v = f (1 + m) and the corresponding working f-number N_w = N (1 + m), yielding \text{DOFoc} = 2 N c (1 + m). Here, the blur-circle scaling incorporates the longitudinal magnification effect in image space, where the ray-cone angle becomes shallower as the working f-number increases, enlarging the depth of focus relative to the infinite-conjugate case for m > 0. This adjustment follows from the paraxial imaging equation and ray heights traced through the aperture stop.

These formulations rely on key assumptions: an ideal lens model without aberrations, monochromatic illumination to ignore chromatic effects, paraxial ray propagation, and conditions where geometric defocus blur dominates over diffraction effects (valid for c \gg \lambda N, with \lambda the wavelength). The parameters N and m directly influence the resulting tolerance, as discussed under Optical Parameters.
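
A brief sketch of these two formulas in code, using illustrative values (f/4 and c = 0.03 mm, assumed here rather than taken from any particular source):

    def dofoc_infinite(f_number, coc):
        """DOFoc = 2 * N * c for a distant object (infinite conjugates)."""
        return 2.0 * f_number * coc

    def dofoc_finite(f_number, coc, magnification):
        """DOFoc = 2 * N * c * (1 + m), using the working f-number N_w = N * (1 + m)."""
        return 2.0 * f_number * coc * (1.0 + magnification)

    # An f/4 lens with c = 0.03 mm, used at infinity and at 1:1 macro (m = 1).
    print(dofoc_infinite(4.0, 0.03))      # 0.24 mm
    print(dofoc_finite(4.0, 0.03, 1.0))   # 0.48 mm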

Calculation Methods

To compute the depth of focus (DOFoc) in practical systems, the process begins by referencing the core formula DOFoc ≈ 2 N c, where N is the f-number and c is the allowable blur circle in the image plane. First, select c based on the detector's resolution; for digital sensors, c is typically set to half the pixel pitch to ensure the blur does not exceed the Nyquist limit, such as c = 1.725 μm for a sensor with 3.45 μm pixels. Next, determine N from the aperture diameter D and focal length f via N = f / D; for example, an f/4 lens has N = 4. Then, apply the core formula to obtain the nominal DOFoc. Finally, adjust for magnification m if imaging at close distances, using the extended form DOFoc ≈ 2 N c (1 + m) to account for the increased tolerance in image space due to lens extension or macro setups.

For complex systems involving varying object distances, approximations incorporate the hyperfocal distance H = f² / (N c) to estimate effective DOFoc across a range; this scales the computation by treating distant objects as contributing minimal defocus while adjusting c dynamically for near-field variations. In tilted configurations, such as those using the Scheimpflug principle to align the plane of sharp focus with a tilted object plane, DOFoc is accounted for by rotating the lens or sensor, which extends the effective range but requires iterative adjustment of the tilt angle to maintain sharpness across the field.

Beyond analytical methods, ray-tracing software such as Zemax enables precise simulation of DOFoc by modeling ray bundles through the optical system, incorporating aberrations and defocus to compute blur spots iteratively for non-ideal lenses. Assuming perfect alignment in calculations overestimates DOFoc, as manufacturing tolerances in camera assembly—such as sensor tilt of ±0.1 mm or back-focus shifts—can reduce the effective range by up to 20-30% in high-resolution systems, necessitating active focus calibration during production to meet performance specifications.
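
The step-by-step procedure above can be written as a short sketch; the 3.45 μm pixel pitch is the value quoted in the text, while the 16 mm focal length and 4 mm aperture are hypothetical choices that give an f/4 lens.

    def circle_of_confusion_um(pixel_pitch_um):
        """Step 1: choose c as half the pixel pitch (Nyquist-based criterion)."""
        return pixel_pitch_um / 2.0

    def f_number(focal_length_mm, aperture_diameter_mm):
        """Step 2: N = f / D."""
        return focal_length_mm / aperture_diameter_mm

    def dofoc_um(n, coc_um, magnification=0.0):
        """Steps 3-4: DOFoc = 2 * N * c, extended by (1 + m) for close-up work."""
        return 2.0 * n * coc_um * (1.0 + magnification)

    c = circle_of_confusion_um(3.45)           # 1.725 um for a sensor with 3.45 um pixels
    n = f_number(16.0, 4.0)                    # hypothetical 16 mm lens with a 4 mm aperture -> f/4
    print(dofoc_um(n, c))                      # nominal DOFoc: 13.8 um
    print(dofoc_um(n, c, magnification=0.5))   # close-up at m = 0.5: 20.7 um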

Practical Applications

In Imaging Systems

In imaging systems such as digital single-lens reflex (DSLR) and mirrorless cameras, depth of focus (DOFoc) determines the allowable range for positioning the sensor relative to the lens, ensuring acceptable image sharpness across manufacturing tolerances. The flange focal distance—the fixed mechanical distance from the lens mount to the sensor plane—must be maintained with precision on the order of hundredths of a millimeter, as even minor deviations can introduce defocus blur exceeding the system's circle-of-confusion criterion. DOFoc, approximated as ±B' × f/# (where B' is the maximum tolerable blur diameter and f/# the lens f-number), guides these tolerances by quantifying how much the sensor can shift longitudinally from the nominal image plane without degrading focus. For instance, in wide-aperture lenses (low f/#), DOFoc is shallower, demanding tighter alignment in camera assembly to prevent consistent defocus across the frame.

DOFoc also imposes fundamental limits on autofocus (AF) precision, particularly distinguishing phase-detection AF (PDAF) in DSLRs from contrast-detection AF (CDAF) in many mirrorless systems. In PDAF, which uses a dedicated AF sensor to split incoming light and measure phase differences, focus accuracy is typically calibrated to within one full DOFoc at the lens's maximum aperture for standard AF points, or one-third DOFoc for high-precision central points (e.g., requiring f/2.8 or faster lenses on professional Canon EOS bodies). This tolerance is essential for reliability, as PDAF's effectively high f-number (e.g., f/22–f/32 equivalent) yields a deeper DOFoc on the AF sensor, reducing sensitivity to minor errors but still bounding overall precision. In contrast, CDAF relies on the main image sensor to analyze contrast gradients, achieving potentially higher accuracy within a narrower DOFoc but at slower speeds due to iterative hunting. For high-speed applications like sports photography, PDAF's faster acquisition—often tracking subjects at rates exceeding 10 frames per second—leverages these DOFoc limits to maintain focus on erratic motion, such as a soccer player sprinting at 30 km/h, where CDAF might lag and miss peak action.

Lens mounting standards further highlight DOFoc's role in system compatibility, with variations in flange focal distance influencing adapter feasibility and focus tolerances. The Canon EF mount, with a 44 mm flange distance, supports broader interoperability than the Nikon F mount's 46.5 mm, as the shorter distance allows simple, optics-free adapters for lenses from longer-flange systems (e.g., adapting Nikon F or medium-format optics to Canon bodies) while preserving infinity focus within DOFoc bounds. Conversely, adapting shorter-flange lenses such as Canon EF to Nikon F bodies requires corrective optics or impossible negative-thickness spacers, as the 2.5 mm excess would shift the image plane beyond typical DOFoc tolerances (e.g., 0.01 mm errors preventing sharp infinity focus). These differences affect professional workflows, where adapters must maintain sub-DOFoc precision to avoid chronic back- or front-focus issues in video rigs or hybrid setups.

Post-processing software can enhance perceived sharpness through sharpening, effectively extending the usable DOFoc by amplifying captured detail and mitigating minor defocus artifacts. However, this digital adjustment does not modify the underlying optical tolerance, as it cannot reconstruct details entirely absent from the capture due to shifts exceeding the true DOFoc—severe blur from misalignment remains irrecoverable, and sharpening is limited to enhancing what the lens and sensor initially captured. Unlike depth of field, which governs object-space sharpness for creative control, DOFoc in these systems prioritizes mechanical and algorithmic reliability for consistent output.
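
As a rough sketch of the adapter argument, the 44 mm and 46.5 mm flange distances quoted above can be compared against a depth-of-focus tolerance computed from an assumed full-frame circle of confusion of 0.03 mm at f/2.8; the dictionary and function names are illustrative, not from any camera maker's specification.

    FLANGE_MM = {"Canon EF": 44.0, "Nikon F": 46.5}

    def dofoc_mm(f_number, coc_mm):
        """Total depth of focus 2 * N * c in millimetres."""
        return 2.0 * f_number * coc_mm

    def adapter_thickness_mm(lens_mount, body_mount):
        """Required adapter thickness = lens-side flange distance minus body flange distance."""
        return FLANGE_MM[lens_mount] - FLANGE_MM[body_mount]

    tolerance_mm = dofoc_mm(2.8, 0.03) / 2.0   # one-sided focus tolerance, ~0.084 mm at f/2.8
    for lens, body in (("Nikon F", "Canon EF"), ("Canon EF", "Nikon F")):
        t = adapter_thickness_mm(lens, body)
        verdict = "simple spacer adapter" if t > 0 else "needs corrective optics"
        print(f"{lens} lens on {body} body: adapter {t:+.1f} mm ({verdict}); "
              f"focus must then be held within +/-{tolerance_mm:.3f} mm")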

In Scientific and Industrial Contexts

In optical microscopy, particularly at high magnifications, the depth of focus (DOFoc) becomes extremely limited, often less than 1 μm for 100× oil-immersion objectives with numerical apertures around 1.25–1.30, necessitating precise focus control to maintain sharp imaging of fine cellular structures. This shallow DOFoc arises from the high numerical aperture required for resolving details as small as 0.25–0.27 μm, making even minor thermal drifts—such as a 1 °C change causing 0.5–1.0 μm focal shifts—a significant challenge in live-cell imaging. To address this limitation for three-dimensional samples, z-stack techniques capture multiple focal planes and computationally combine them into an extended-focus image, enabling comprehensive volumetric analysis without mechanical refocusing during acquisition.

In machine vision for automated inspection, DOFoc tolerances are critical for maintaining consistent sharpness across varying object positions, such as on moving conveyor belts in production lines. For instance, in semiconductor wafer scanning, wafer warpage can exceed 100 μm across a single die, far surpassing the typical DOFoc of conventional optical systems, which leads to defocus and unreliable defect detection at nanometer scales. Systems mitigate this by employing focus stacking, where multiple images at different focal depths are merged to extend the effective DOFoc, ensuring all surface features remain sharp during high-speed scanning of wafers for voids or other defects. This approach enhances precision in dynamic environments, reducing false positives in quality-control processes.

Recent advancements in ophthalmic and medical imaging, particularly as of 2024, highlight distinctions in DOFoc for retinal imaging, where the eye's accommodation dynamically influences the perceived focus range on the retina. In adaptive optics (AO) ophthalmoscopy, the inherently small DOFoc—often limited to specific retinal layers such as the inner plexiform layer—requires ultrafast corrections to counteract accommodation-induced fluctuations, which can shift focus between layers such as the inner nuclear and nerve fiber layers during steady-state viewing. Ocular accommodation adjusts the crystalline lens to refocus the retinal image, affecting axial intensity distributions and necessitating non-cycloplegic stabilization techniques for accurate layer-specific imaging in non-paralyzed eyes. These considerations enable finer focus control, with focus steps as precise as 0.02 diopters (approximately 7.4 μm), improving diagnostic resolution for conditions involving retinal depth variations.

Industrial applications leverage wavefront coding to artificially extend DOFoc through computational imaging, minimizing the need for high-precision focus adjustments in machine vision systems. This technique employs a phase mask, such as a cubic phase element, at the pupil plane to render the point-spread function insensitive to defocus, achieving near-diffraction-limited performance over depths up to 30 times greater than conventional limits. By combining this optical preprocessing with digital restoration filtering, systems reduce sensitivity to misalignment in inspection tools, such as those used for precision metrology in manufacturing, thereby lowering costs associated with focusing hardware. Pioneered in seminal work on extended depth of field for incoherent imaging systems, wavefront coding has been integrated into infinity-corrected microscopes and industrial inspection cameras for robust performance across varied working distances.
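
The z-stack and focus-stacking approaches described above can be sketched computationally; the following minimal example selects, per pixel, the sharpest slice of a small synthetic stack using a Laplacian focus measure (a common choice assumed here for illustration, not the algorithm of any cited system, and not a production implementation).

    import numpy as np

    def laplacian_energy(img):
        """Simple per-pixel focus measure: squared response of a 4-neighbour Laplacian."""
        lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
               np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4.0 * img)
        return lap ** 2

    def focus_stack(stack):
        """Merge a z-stack of shape (n_slices, H, W) into one extended-focus image by
        picking, for each pixel, the slice with the highest local focus measure."""
        scores = np.stack([laplacian_energy(s) for s in stack])   # (n_slices, H, W)
        best = np.argmax(scores, axis=0)                          # sharpest slice index per pixel
        return np.take_along_axis(stack, best[None, ...], axis=0)[0]

    # Tiny synthetic demonstration: two slices, each "sharp" (textured) in a different half.
    rng = np.random.default_rng(0)
    texture = rng.random((64, 64))
    flat = np.full((64, 64), 0.5)
    top_half = np.arange(64)[:, None] < 32
    slice_a = np.where(top_half, texture, flat)   # sharp on top, defocused below
    slice_b = np.where(top_half, flat, texture)   # defocused on top, sharp below
    merged = focus_stack(np.stack([slice_a, slice_b]))
    print(merged.shape)   # (64, 64); top half comes from slice_a, bottom half from slice_b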
