
Ptychography

Ptychography is a computational imaging technique that reconstructs the complex-valued transmission function (amplitude and phase) of a specimen from a series of far-field intensity patterns, obtained by scanning a localized coherent illumination probe across overlapping regions of the sample and applying iterative phase-retrieval algorithms. The technique addresses the phase problem in coherent diffractive imaging by exploiting redundancy from the overlapping probe positions, enabling wavelength-limited resolution without relying on physical lenses, which are prone to aberrations. Originating from proposals in electron crystallography by Walter Hoppe in 1969, ptychography was theoretically formalized for electron microscopy by Rodenburg and Bates in 1992 and for X-rays by Chapman in 1996, with practical implementations emerging in the early 2000s through algorithms like the difference map and the ptychographic iterative engine (PIE) developed by Faulkner and Rodenburg in 2004. Key later advancements include the first hard X-ray experimental demonstrations in 2007 and the introduction of Fourier ptychography in 2013, which extends the method to wide-field optical imaging by angularly varying the illumination. Ptychography's advantages include superior dose efficiency for radiation-sensitive samples, quantitative phase contrast for transparent specimens like biological cells, and applicability across wavelengths—from visible light to X-rays and electrons—allowing integration with synchrotron sources, electron microscopes, and standard optical setups. It overcomes traditional trade-offs between resolution and field of view, achieving sub-micron to atomic-scale detail over millimeter-scale areas, and requires no reference beam or staining, making it well suited to live-cell imaging. Notable applications span materials science, where it enables 3D tomography of nanostructures at synchrotron facilities; electron microscopy for atomic-resolution imaging of defects in crystals; and biomedicine, including quantitative phase imaging of cells for drug screening, pathology, blood analysis, and high-throughput detection of rare circulating tumor cells.
Recent developments, such as near-field and Fourier variants and integration with deep learning for faster reconstructions, continue to expand its utility in real-time and large-scale imaging tasks.

Fundamentals

Definition and Basic Principles

Ptychography is a lensless computational imaging technique that reconstructs the complex-valued transmission function—encompassing both amplitude and phase—of a specimen from a series of overlapping diffraction intensity patterns. It operates without physical lenses by leveraging scanned coherent illumination to probe the sample, enabling high-resolution imaging beyond traditional optical limits. This method addresses the limitations of direct imaging by computationally inverting far-field data collected across multiple overlapping regions. The basic principles of ptychography involve illuminating the sample with a localized coherent probe, such as a focused X-ray or electron beam, and systematically scanning it across the specimen in a raster or grid pattern. At each probe position, the transmitted or scattered wave interferes in the far field, producing a diffraction pattern that is recorded by a detector, forming a dataset known as a ptychogram. The key to successful reconstruction lies in the spatial overlap between adjacent probe positions, which introduces redundancy in the measured intensities; this information shared across patterns allows iterative algorithms to recover the lost phase and resolve ambiguities inherent in intensity-only measurements. Significant spatial overlap between adjacent probe positions, typically more than 50%, is employed to ensure convergence and uniqueness, though overlaps as low as 30% can suffice with advanced algorithms and sufficient signal-to-noise ratio. Central to the technique is the probe illumination function, which describes the complex wavefront of the incident beam—often modeled as a Gaussian or Airy disk—and interacts multiplicatively with the sample's transmission function to form the exit wave. In a standard experimental setup, a coherent source generates the probe, which is directed onto the sample; the sample is translated laterally relative to the probe (or vice versa) using a scanning stage, while a distant detector captures the resulting speckle-like diffraction fringes without imaging optics.
This configuration exploits the Fourier relationship between the exit wave and the recorded intensities. Ptychography has been implemented across diverse modalities, including X-ray ptychography for nondestructive imaging of extended samples at synchrotron facilities, electron ptychography for atomic-scale imaging in materials science, and optical ptychography using visible light for quantitative phase imaging in biomedical applications.
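The acquisition described above can be sketched numerically. The following toy simulation builds a ptychogram as the set of far-field intensities from overlapping probe positions; the grid sizes, the Gaussian probe, and the random complex object are illustrative assumptions, not parameters from the text:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64                      # object grid size (pixels)
probe_size = 32             # probe window size (pixels)
step = 8                    # scan step -> 75% linear overlap for a 32-px probe

# Complex object: amplitude in [0.8, 1.0], phase in [-0.5, 0.5] rad
obj = (0.8 + 0.2 * rng.random((N, N))) * np.exp(1j * (rng.random((N, N)) - 0.5))

# Gaussian probe confined to a 32x32 window
y, x = np.mgrid[:probe_size, :probe_size] - probe_size / 2
probe = np.exp(-(x**2 + y**2) / (2 * 6.0**2)).astype(complex)

# Ptychogram: far-field intensity |F{probe * object patch}|^2 at each position
positions = [(r, c) for r in range(0, N - probe_size + 1, step)
                    for c in range(0, N - probe_size + 1, step)]
ptychogram = []
for r, c in positions:
    exit_wave = probe * obj[r:r + probe_size, c:c + probe_size]  # multiplicative model
    ptychogram.append(np.abs(np.fft.fft2(exit_wave))**2)

print(len(positions), ptychogram[0].shape)
```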

The Phase Problem and Retrieval

In diffraction-based imaging techniques, the phase problem arises because detectors measure only the intensity of the scattered wave, which corresponds to the squared magnitude of the Fourier transform of the object's exit wave, while the phase information—essential for reconstructing the spatial distribution via inverse Fourier transform—is lost. This ambiguity prevents direct inversion to recover the complex-valued image of the sample, as multiple phase distributions can yield the same diffraction pattern. Ptychography overcomes this limitation by employing a scanning illumination probe that overlaps across adjacent regions of the sample, generating a set of diffraction patterns with built-in redundancy. These overlapping measurements impose consistency constraints on the shared portions of the sample, allowing iterative algorithms to enforce agreement between the modeled and measured intensities across all patterns, thereby retrieving the lost phase information. The redundancy from overlaps, typically more than 50%, helps ensure uniqueness of the reconstruction when scattering is predominantly forward and the probe is well localized. Under the thin-object (multiplicative) approximation, the exit wave at the k-th probe position is modeled as the product \psi_k(\mathbf{r}) = P(\mathbf{r} - \mathbf{R}_k) \cdot O(\mathbf{r}), where P(\mathbf{r}) is the complex illumination function, O(\mathbf{r}) is the complex object transmission function, and \mathbf{R}_k denotes the lateral shift of the probe for the k-th measurement. The measured intensity for each position is then the modulus squared of the Fourier transform of the exit wave: I_k(\mathbf{u}) = \left| \mathcal{F} \left\{ \psi_k(\mathbf{r}) \right\} (\mathbf{u}) \right|^2, where \mathcal{F} denotes the Fourier transform and \mathbf{u} is the spatial-frequency coordinate in the far-field plane. This formulation assumes a thin sample and paraxial propagation, capturing the essential multiplicative interaction between probe and object.
Phase retrieval in ptychography proceeds iteratively by projecting the current estimate of the exit wave onto two constraint sets: in reciprocal space, the modulus of the Fourier transform is replaced by the square root of the measured intensity I_k(\mathbf{u}) while preserving the estimated phase; in real space, consistency is enforced across the known overlap regions, ensuring the reconstructed object agrees with the probe positions across all illuminations. These projections exploit the diversity introduced by the shifted probes to resolve ambiguities, converging to a solution that simultaneously refines both the object and probe functions.
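The reciprocal-space constraint described above can be written as a small function; the array sizes and random test fields below are arbitrary assumptions for demonstration:

```python
import numpy as np

def modulus_projection(exit_wave, measured_intensity):
    """Reciprocal-space constraint: keep the estimated phase, replace the
    modulus with the square root of the measured intensity."""
    F = np.fft.fft2(exit_wave)
    phase = np.exp(1j * np.angle(F))
    F_constrained = np.sqrt(measured_intensity) * phase
    return np.fft.ifft2(F_constrained)

# After projection, the far-field modulus matches the measurement exactly.
rng = np.random.default_rng(1)
psi = rng.standard_normal((16, 16)) + 1j * rng.standard_normal((16, 16))
I_meas = np.abs(np.fft.fft2(rng.standard_normal((16, 16)) + 0j))**2
psi_new = modulus_projection(psi, I_meas)
print(np.allclose(np.abs(np.fft.fft2(psi_new))**2, I_meas))
```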

Experimental Configurations

Far-Field Focused-Probe Ptychography

Far-field focused-probe ptychography employs a coherent illumination source, such as an X-ray or electron beam, that is focused onto the sample using a lens or Fresnel zone plate to form a localized probe. The probe is raster-scanned across the specimen using piezoelectric translation stages, with the scan typically following a rectangular or hexagonal grid pattern. At each probe position, the transmitted or scattered wavefront diffracts, and the resulting intensity pattern is captured in the far field—corresponding to the Fraunhofer regime—by a downstream detector, such as a charge-coupled device (CCD) or pixel array detector. This setup ensures that the diffraction patterns are recorded at a sufficient distance from the sample for the Fraunhofer approximation to hold, enabling the collection of a ptychogram comprising multiple overlapping measurements. The probe in this geometry generally features a Gaussian or Airy-disk intensity profile, determined by the focusing optics and aperture, with the beam's finite size providing spatial confinement for localized illumination. Scanning overlaps between adjacent probe positions are crucial for data redundancy and are typically set to 60–80% to facilitate robust reconstruction by linking information across measurements. Experimental parameters often include probe diameters of 10–100 nm, tailored to the wavelength and application—for instance, smaller probes for higher resolution in hard X-ray regimes—while detector pixel sizes are chosen to satisfy Nyquist sampling of the diffraction fringes, usually on the order of 10–50 μm per pixel, to capture speckle details without aliasing. This far-field approach offers significant advantages for high-resolution imaging, achieving spatial resolutions limited primarily by the probe size and wavelength rather than lens aberrations or detector limitations, which is particularly beneficial for X-ray and electron modalities where traditional optics suffer from chromatic aberrations or low numerical apertures. It supports quantitative amplitude and phase imaging of extended samples with tolerance to partial coherence and noise, enabling sub-10 nm resolutions in practice.
For example, at synchrotron facilities, far-field focused-probe ptychography is routinely used for 2D imaging of nanostructured materials, such as integrated circuits, yielding roughly 20 nm resolution at low dose.
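The sampling considerations above lend themselves to back-of-envelope arithmetic. All numerical values below (photon energy, sample-to-detector distance, pixel pitch, probe diameter) are assumed for illustration only:

```python
wavelength = 1.24e-10   # ~10 keV hard X-rays, metres (assumed)
z = 2.0                 # sample-to-detector distance, metres (assumed)
pixel = 55e-6           # detector pixel pitch, metres (assumed)
n_pix = 256             # detector pixels used per pattern side (assumed)
probe_d = 500e-9        # probe diameter, metres (assumed)

# Speckle period at the detector ~ lambda*z/D; Nyquist needs >= 2 pixels per speckle
speckle = wavelength * z / probe_d
oversampling = speckle / pixel

# Real-space pixel size of the reconstruction: lambda*z / (N * p)
dr = wavelength * z / (n_pix * pixel)

print(f"speckle {speckle*1e6:.0f} um, oversampling {oversampling:.1f}, "
      f"reconstruction pixel {dr*1e9:.1f} nm")
```

With these assumed values the detector comfortably oversamples the speckle and the reconstruction pixel lands in the tens-of-nanometres range quoted in the text.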

Near-Field Ptychography

Near-field ptychography operates in the Fresnel regime, where intensity patterns are recorded at small distances from the sample, typically on the order of millimeters to centimeters for X-rays or micrometers for electrons, without the need for focusing lenses. In this inline-holography-based setup, a coherent probe illuminates an extended area of the sample, and the resulting interference between the scattered and unscattered waves is captured directly on a detector positioned close to the specimen. The sample is scanned laterally relative to the illumination, with overlapping probe positions providing redundancy for phase retrieval, enabling reconstruction of the complex-valued exit wave. This configuration is particularly sensitive to phase shifts induced by the sample due to the quadratic phase factors in the near-field propagation, which encode information about both amplitude and phase through self-interference. Unlike far-field methods, near-field ptychography accommodates thicker or extended objects by capturing propagation effects that reveal depth information, making it suitable for 3D reconstructions via tomographic approaches or multi-distance measurements. For instance, it has been applied to optically thick specimens, such as a 46 μm uranium sphere, achieving quantitative phase imaging with resolutions approaching 1 μm. The forward model in near-field ptychography relies on the Fresnel propagator to relate the sample's exit wave to the measured intensities. The propagated field at distance z from the object can be expressed in the Fourier domain as \psi'(u,v) = \mathcal{F}^{-1} \left\{ H(u,v)\, \mathcal{F} \{\psi(r)\} \right\}, where \mathcal{F} and \mathcal{F}^{-1} denote the Fourier transform and its inverse, respectively, \psi(r) is the exit wave in the spatial domain, and H(u,v) = \exp\left[i \pi \lambda z (u^2 + v^2)\right] is the Fresnel transfer function for wavelength \lambda. This kernel-based propagation facilitates efficient computational modeling of the imaging process during reconstruction.
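A minimal transfer-function propagator along the lines of the equation above might look as follows. The grid size, wavelength, and pixel pitch are arbitrary assumptions, and the sign of the exponent follows the convention used in this section (conventions differ between references):

```python
import numpy as np

def fresnel_propagate(psi, wavelength, z, dx):
    """Propagate a 2-D complex field psi (pixel pitch dx) a distance z using
    the Fresnel transfer function H(u,v) = exp(i*pi*lambda*z*(u^2+v^2))."""
    n = psi.shape[0]
    u = np.fft.fftfreq(n, d=dx)            # spatial frequencies, cycles/metre
    U, V = np.meshgrid(u, u, indexing="ij")
    H = np.exp(1j * np.pi * wavelength * z * (U**2 + V**2))
    return np.fft.ifft2(H * np.fft.fft2(psi))

# |H| = 1, so free-space propagation conserves total intensity
rng = np.random.default_rng(2)
psi0 = rng.standard_normal((128, 128)) + 1j * rng.standard_normal((128, 128))
psi1 = fresnel_propagate(psi0, wavelength=500e-9, z=1e-3, dx=1e-6)
print(np.allclose(np.sum(np.abs(psi1)**2), np.sum(np.abs(psi0)**2)))
```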
Recent advances have extended near-field ptychography through full-field structured-illumination schemes that replace focused probes with patterned beams to enhance information encoding and reduce acquisition times. In 2024, a configuration using a diffractive optical element to generate structured illumination demonstrated high-resolution phase imaging of biological samples with improved signal-to-noise ratios. These developments incorporate Fresnel propagation kernels to model wave propagation accurately, enabling applications in low-dose imaging where radiation sensitivity is a concern, such as in cryogenic imaging of biological specimens.

Fourier Ptychography

Fourier ptychography (FP) is a computational imaging technique that extends the resolution of conventional bright-field microscopy by using structured illumination with angled plane waves, typically generated by an LED array positioned beneath the sample, to capture a series of low-resolution intensity images. These images are then processed to reconstruct a high-resolution, wide-field complex-valued image equivalent to that obtained with a high numerical aperture (NA) objective, without requiring mechanical scanning of the sample or probe. The setup replaces the standard light source in a microscope with a programmable illuminator, such as a densely packed LED array, where each LED emits a quasi-plane wave at a specific angle to illuminate the sample, and a low-NA objective collects the resulting intensity images. This approach achieves synthetic NA values up to 0.5 or higher, enabling sub-micron resolution over fields of view exceeding 100 times those of traditional high-NA systems. The underlying principle of FP relies on filling the Fourier space of the sample's exit wave through overlapping spectral "windows" provided by each illumination angle, allowing iterative phase retrieval algorithms to stitch these components into a coherent high-resolution reconstruction. Under plane-wave illumination at angle \theta_\ell, the exit wave from the sample, denoted O(\mathbf{r}) with spectrum \hat{O}(\mathbf{u}), is shifted in Fourier space and filtered by the objective pupil function P(\mathbf{u}), where \mathbf{u} represents spatial frequency. The intensity image for the \ell-th illumination is given by I_\ell(\mathbf{r}) = \left| \mathcal{F}^{-1} \left\{ P(\mathbf{u}) \cdot \hat{O}(\mathbf{u} - \mathbf{u}_\ell) \right\} \right|^2, with \mathbf{u}_\ell = (\sin\theta_\ell / \lambda) \mathbf{\hat{k}} the Fourier shift corresponding to the illumination wavevector, \mathcal{F}^{-1} the inverse Fourier transform, and \lambda the wavelength.
The overlaps between adjacent spectral windows, typically 60–70% for optimal reconstruction, enable robust recovery of the full object spectrum via phase retrieval methods that enforce consistency across measurements. This process circumvents the phase problem by leveraging redundancy in the Fourier domain, yielding quantitative phase and amplitude information. Recent developments include neural pupil estimation for whole-field high-resolution reconstruction, as demonstrated in 2025 with the NePE-FPM method, which dynamically optimizes the pupil function using implicit neural representations and multi-resolution hash encoding to achieve continuous spectral shifts and improved fidelity in off-axis regions, reducing artifacts in large-scale reconstructions. Applications in digital pathology have advanced with FP systems integrated into standard microscopes, enabling gigapixel-scale quantitative phase imaging of tissue samples for automated analysis, with resolutions approaching 0.5 μm over fields of view up to 4 mm². Unlike scanning-based ptychography, FP acquires its images in rapid sequence without mechanical scanning, making it particularly suitable for live-cell imaging and dynamic processes where temporal resolution is critical.
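The FP forward model above can be sketched for a toy object: shift the object spectrum, apply the pupil as a low-pass filter, and take the intensity of the inverse transform. The pupil radius, grid size, and spectrum shifts below are illustrative assumptions:

```python
import numpy as np

def fp_low_res_image(obj_spectrum, pupil, shift):
    """Intensity image for one illumination angle: shift the object spectrum,
    apply the pupil (low-pass), inverse-transform, take |.|^2."""
    shifted = np.roll(obj_spectrum, shift, axis=(0, 1))
    return np.abs(np.fft.ifft2(shifted * pupil))**2

n = 128
rng = np.random.default_rng(3)
obj = np.exp(1j * rng.random((n, n)))        # toy phase object
spectrum = np.fft.fft2(obj)

# Circular pupil of radius 20 px, standing in for a low-NA objective (assumed)
f = np.fft.fftfreq(n) * n
FX, FY = np.meshgrid(f, f, indexing="ij")
pupil = (FX**2 + FY**2 <= 20**2).astype(float)

# One image per LED; integer spectrum shifts mimic the angled illuminations
images = [fp_low_res_image(spectrum, pupil, s)
          for s in [(0, 0), (12, 0), (0, 12), (12, 12)]]
print(len(images), images[0].shape)
```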

Bragg and Reflection Ptychography

Bragg ptychography extends ptychographic imaging to crystalline samples by leveraging Bragg diffraction in reflection geometry, enabling the reconstruction of three-dimensional strain fields and lattice distortions with nanoscale resolution. This approach combines the overlap constraints of conventional ptychography with the sensitivity of coherent diffraction to atomic-scale displacements in periodic structures. In the experimental setup, a focused coherent X-ray probe is raster-scanned across the sample surface in overlapping positions, with the sample oriented at the Bragg angle relative to the incident beam. Diffraction patterns are recorded on a far-field detector as the sample is angularly scanned through multiple positions along the rocking curve, typically in steps of 0.005° or finer, to capture a dataset analogous to a rocking curve for each probe position. This configuration allows for quantitative mapping of lattice strains down to 10^{-4} levels over fields of view up to several micrometers, with resolutions on the order of 40 nm in all dimensions. The reflection mode is particularly suited for surface-sensitive studies of opaque or thick crystalline materials, circumventing the limitations of transmission geometries. The underlying principles rely on phase retrieval from the diffraction intensities to reconstruct the complex electron density and displacement field \mathbf{u}(\mathbf{r}) of the crystal lattice, where the retrieved phase \phi_{hkl}(\mathbf{r}) = \mathbf{Q}_{hkl} \cdot \mathbf{u}(\mathbf{r}) directly encodes distortions along the scattering vector \mathbf{Q}_{hkl}. The scattering vector \mathbf{q} must satisfy the Bragg condition |\mathbf{q}| = \frac{4\pi \sin \theta}{\lambda}, with \theta the Bragg angle and \lambda the X-ray wavelength, ensuring selective diffraction from specific lattice planes. Bragg ptychography has been widely applied in synchrotron X-ray imaging of crystalline materials, providing insights into strain evolution in structures like He-implanted tungsten foils.
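As a quick worked example of the Bragg condition: combining |q| = 4π sin θ/λ with the reciprocal-lattice spacing |q| = 2π/d recovers Bragg's law sin θ = λ/(2d). The ~8 keV wavelength below is an assumed illustrative value; the Si(111) spacing is the standard 3.135 Å:

```python
import math

wavelength = 1.55e-10       # ~8 keV X-rays, metres (assumed)
d = 3.135e-10               # Si(111) lattice-plane spacing, metres

# |q| = 4*pi*sin(theta)/lambda together with |q| = 2*pi/d
# gives sin(theta) = lambda / (2*d)
theta = math.asin(wavelength / (2 * d))
q = 4 * math.pi * math.sin(theta) / wavelength

print(f"Bragg angle {math.degrees(theta):.2f} deg, |q| = {q:.3e} 1/m")
```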
For example, Bragg ptychography has revealed nanoscale defects and strain fields in semiconductor heterostructures, such as InGaN/GaN nanowires and silicon-on-insulator devices, aiding the study of strain relaxation and epitaxial growth.

Multislice and Vectorial Variants

Multislice ptychography extends conventional ptychographic reconstruction to three dimensions by modeling the probe wave's propagation through a series of thin sample slices, thereby incorporating the multiple-scattering effects prevalent in thicker specimens. This approach divides the specimen into discrete parallel planes, each characterized by a transmission function t_n, and iteratively simulates wave evolution across these layers during the forward modeling process. The propagation from the wave field \psi_n at slice n to \psi_{n+1} at the next slice is described by \psi_{n+1} = \mathcal{F}^{-1} \left\{ P \cdot \mathcal{F} \{ t_n \psi_n \} \right\}, where the wave is first multiplied by the slice transmission function t_n in real space and then propagated to the next slice by the free-space propagation kernel P applied in Fourier space; \mathcal{F} and \mathcal{F}^{-1} denote the Fourier transform and its inverse, respectively. This multislice framework enables simultaneous recovery of multiple axial planes with substantially lower data demands than traditional ptychographic tomography, making it suitable for strongly scattering samples beyond the weak-phase approximation. In electron microscopy applications, multislice ptychography has demonstrated resolution at the limits set by thermal lattice vibrations, even in multilayer materials where multiple scattering alters the probe shape and causes dechannelling. Recent advancements integrate generative priors, such as diffusion models, into multislice ptychography reconstructions via posterior sampling techniques, yielding enhanced structural fidelity and robustness for atomic-scale 3D imaging of crystals. These methods achieve super-resolution by correcting probe aberrations and provide depth-resolved information with axial resolutions down to about 2 nm and transverse resolutions of about 0.7 Å, representing a 13.5-fold improvement in information content over conventional approaches.
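The standard multislice recursion (multiply by the slice transmission in real space, then free-space propagate via a Fourier-domain kernel) can be sketched as follows, assuming phase-only slices and illustrative electron-scale parameters:

```python
import numpy as np

def multislice(psi, slices, wavelength, dz, dx):
    """Propagate psi through thin slices: multiply by t_n in real space,
    then propagate by dz with a Fourier-domain Fresnel kernel P."""
    n = psi.shape[0]
    u = np.fft.fftfreq(n, d=dx)
    U, V = np.meshgrid(u, u, indexing="ij")
    # Fresnel kernel; one common sign convention (conventions differ)
    P = np.exp(-1j * np.pi * wavelength * dz * (U**2 + V**2))
    for t in slices:
        psi = np.fft.ifft2(P * np.fft.fft2(t * psi))
    return psi

rng = np.random.default_rng(4)
n = 64
probe = np.ones((n, n), complex)                       # toy plane-wave probe
# three weak phase-only slices (|t| = 1), an illustrative assumption
slices = [np.exp(1j * 0.05 * rng.standard_normal((n, n))) for _ in range(3)]
out = multislice(probe, slices, wavelength=2e-12, dz=2e-9, dx=5e-11)
print(out.shape)
```

Because the slices are phase-only and the kernel has unit modulus, the total intensity is conserved through the stack.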
Vectorial ptychography variants overcome the scalar wave assumption of standard methods by accounting for the full vectorial nature of the field, including probe polarization states and aberrations that arise in high-numerical-aperture or anisotropic imaging scenarios. These techniques utilize the Jones matrix formalism to describe non-scalar wave-sample interactions, representing the specimen's response as a 2×2 complex matrix that modulates the incident polarization components. In optical vectorial ptychography, for example, variable-angle polarized illumination facilitates the quantitative retrieval of the full Jones matrix, enabling high-resolution mapping of polarimetric properties like birefringence and dichroism in anisotropic specimens. Further extensions broaden applicability by modeling chiral or magneto-optical responses within the same framework. Multislice and vectorial ptychography find key applications in thick biological samples, where multiple scattering obscures signals in traditional setups, and in magnetic materials, where vectorial sensitivity reveals domain structures and polarization-dependent contrast.
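A minimal Jones-matrix example illustrates the formalism: a quarter-wave plate acting on 45° linear polarization, a standard textbook case rather than a ptychographic reconstruction (in vectorial ptychography such a 2×2 matrix would be retrieved per sample pixel):

```python
import numpy as np

# Jones matrix of a quarter-wave plate with horizontal fast axis
qwp = np.array([[1, 0],
                [0, 1j]], complex)

# 45-degree linear polarization as the incident Jones vector
e_in = np.array([1, 1], complex) / np.sqrt(2)
e_out = qwp @ e_in                       # output polarization state

# Circular light: equal component amplitudes, 90-degree relative phase
amp_ratio = abs(e_out[0]) / abs(e_out[1])
rel_phase_deg = np.degrees(np.angle(e_out[1]) - np.angle(e_out[0]))
print(amp_ratio, rel_phase_deg)          # -> 1.0 and 90.0 (circular)
```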

Reconstruction Algorithms

Classical Iterative Methods

Classical iterative methods for ptychographic reconstruction address the phase problem by enforcing consistency between measured intensity data in the Fourier domain and estimated object support or probe constraints in the real domain through repeated projections. These algorithms, originating from general phase-retrieval techniques, were adapted to ptychography to exploit overlapping illumination regions for improved convergence and robustness. The Error Reduction (ER) algorithm alternates between enforcing the measured amplitudes on the current estimate of the exit wave and applying real-space constraints to the inverse-transformed result. In each iteration, the Fourier transform of the current object-probe estimate is adjusted so that its moduli match the square roots of the measured intensities (with phases retained), followed by an inverse transform and masking to the known support region. This process reduces discrepancies iteratively but can stagnate in local minima without additional feedback mechanisms. To mitigate stagnation in ER, the Hybrid Input-Output (HIO) method introduces a feedback parameter to modify updates outside the support region. Developed by Fienup for general phase retrieval, HIO computes an output estimate \gamma after Fourier constraint enforcement; inside the support, the new input directly takes the value of \gamma, while outside the support it is updated as \psi' = \psi - \beta \gamma, where \beta is a tunable feedback factor (typically 0.5–1.0). This hybrid approach promotes escape from stagnant solutions while maintaining consistency within constraints. In ptychography, HIO is often integrated sequentially with position-specific projections to handle overlapping data. The Difference Map (DM) algorithm projects between multiple constraint sets to ensure global consistency across all probe positions.
It models the reconstruction as finding a fixed point in the intersection of the real-space overlap constraint set and the Fourier modulus constraint set, using the difference between successive projections to update estimates: \psi_{n+1} = \psi_n + \alpha (P_2 P_1 - I) \psi_n, where P_1 and P_2 are projectors onto the respective sets, \alpha is a relaxation parameter, and I is the identity operator. This approach, introduced for ptychography by Thibault et al., enables simultaneous processing of all diffraction patterns and joint probe and object refinement, enhancing stability for extended specimens. The extended Ptychographical Iterative Engine (ePIE) iteratively updates both the object transmission function and the probe function by exploiting the overlap regions. Starting with an initial probe and object guess, for each scan position j, the exit wave \psi_j = O \cdot P_j (where O is the object and P_j the shifted probe) is Fourier-transformed, its amplitudes are enforced to match the measurements, and it is inverse-transformed to yield an updated exit wave estimate \psi_j'. The object update in the overlap region is then O'(\mathbf{r}) = O(\mathbf{r}) + \alpha \frac{P_j^*(\mathbf{r})}{\max_{\mathbf{r}} |P_j(\mathbf{r})|^2} \left( \psi_j'(\mathbf{r}) - \psi_j(\mathbf{r}) \right), with \alpha a step size (often 0.5–1.0); the probe is refined analogously using the current object estimate. This position-adaptive approach, building on the original PIE algorithm, allows self-consistent retrieval without prior probe knowledge. A pseudocode overview for ePIE illustrates the iterative structure:
Initialize: object O, probe P, measured intensities I_j for each position j
For iteration k = 1 to N:
    For each scan position j:
        Compute exit wave ψ_j = O * shift(P, position_j)
        Compute Fourier transform F(ψ_j)
        Enforce amplitudes: F'(q) = sqrt(I_j(q)) * exp(i arg(F(ψ_j)(q)))
        Inverse transform: ψ_j' = F^{-1}(F')
        Update object: O(r) += (α * P*(r) / max|P|^2) * (ψ_j'(r) - ψ_j(r))   [for r in overlap_j]
        Update probe:  P(r) += (β * O*(r) / max|O|^2) * (ψ_j'(r) - ψ_j(r))   [using current O]
End
Output: reconstructed O and P
These methods typically converge to sufficient accuracy within 50–200 iterations, depending on overlap fraction (60–80% recommended) and noise levels, with error metrics such as the normalized intensity residual dropping below 1% of their initial values.
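The pseudocode above can be turned into a runnable sketch. For brevity this version holds the probe fixed and applies only the standard ePIE object update (full ePIE also updates the probe with the analogous rule); all sizes and the synthetic test object are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)
n, m, step = 64, 32, 8                       # object size, probe size, scan step

# Ground-truth complex object and a known Gaussian probe
obj_true = (0.9 + 0.1 * rng.random((n, n))) * np.exp(0.5j * rng.random((n, n)))
y, x = np.mgrid[:m, :m] - m / 2
probe = np.exp(-(x**2 + y**2) / (2 * 6.0**2)).astype(complex)

positions = [(r, c) for r in range(0, n - m + 1, step)
                    for c in range(0, n - m + 1, step)]
I_meas = [np.abs(np.fft.fft2(probe * obj_true[r:r+m, c:c+m]))**2
          for r, c in positions]

def residual(o):
    """Sum of squared intensity mismatches over all scan positions."""
    return sum(np.sum((np.abs(np.fft.fft2(probe * o[r:r+m, c:c+m]))**2 - I)**2)
               for (r, c), I in zip(positions, I_meas))

obj = np.ones((n, n), complex)               # flat starting guess
err0 = residual(obj)
alpha = 1.0
for _ in range(100):
    for (r, c), I in zip(positions, I_meas):
        psi = probe * obj[r:r+m, c:c+m]
        F = np.fft.fft2(psi)
        psi_new = np.fft.ifft2(np.sqrt(I) * np.exp(1j * np.angle(F)))  # modulus projection
        obj[r:r+m, c:c+m] += alpha * np.conj(probe) * (psi_new - psi) / np.abs(probe).max()**2
err = residual(obj)
print(f"intensity residual reduced from {err0:.3e} to {err:.3e}")
```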

Advanced Computational Techniques

Advanced computational techniques in ptychography have evolved to address challenges in reconstruction quality, computational efficiency, and data scarcity, incorporating regularization strategies, machine-learning integrations, and optimized hardware utilization. Regularization methods, such as total variation (TV) priors, promote smoothness in reconstructions while preserving edges, effectively reducing noise artifacts in oversampled datasets. For instance, anisotropic TV regularization has been applied to sparsely sampled Fourier ptychography, improving robustness to noise by jointly estimating object and pupil functions. Similarly, sparsity priors enforce low-rank representations of the object or probe, aiding reconstruction from undersampled data where overlap between scans is limited. These approaches build on classical iterative methods by adding penalty terms to stabilize convergence in noisy environments. Neural networks have advanced reconstruction through physics-informed frameworks, which embed forward models like the ptychographic diffraction model directly into network architectures to ensure physical consistency. For example, unsupervised physics-informed neural networks (PINNs) accelerate reconstructions by 100- to 1000-fold compared to traditional methods, while maintaining accuracy in quantitative phase imaging. Recent developments from 2023 to 2025 emphasize pre-trained models, such as adapting large convolutional networks to ptychographic reconstruction, which enhances generalization and reduces training-data needs; one such strategy achieved superior reconstruction quality on diverse datasets by leveraging transfer learning from ImageNet-pretrained backbones. Real-time reconstruction is facilitated by on-the-fly GPU processing, enabling immediate feedback during experiments to adjust parameters dynamically. Multi-GPU implementations of algorithms like the multi-mode difference map support distributed reconstruction, scaling to large datasets with minimal communication overhead and achieving reconstruction times of tens of seconds for high-resolution images. These techniques distribute gradient computations across nodes, outperforming single-GPU baselines in accuracy and speed for synchrotron-scale problems.
Low-dose methods leverage Bayesian approaches to handle photon-limited regimes, using non-convex optimization to reduce the required electron or photon doses by two orders of magnitude while preserving structural details in cryogenic samples. In multislice ptychography, generative priors from diffusion models serve as regularizers, improving atomic-resolution reconstructions; 2025 benchmarks demonstrate that integrating such priors via diffusion posterior sampling yields sub-nanometer 3D maps from sparse data, with error rates reduced by up to 30% over unregularized baselines. A common formulation for these regularized reconstructions minimizes a loss function that balances data fidelity and prior constraints: L = \sum \left\| I_{\text{meas}} - |\mathcal{F}\{\psi\}|^2 \right\|^2 + \lambda R(\psi), where I_{\text{meas}} are the measured intensities, \mathcal{F} denotes the Fourier transform, \psi is the exit wave, \lambda is a regularization parameter, and R(\psi) is a prior such as TV or sparsity. Software tools like Ptychography 4.0 facilitate these advancements by providing efficient coordinate transforms from detector data to reconstruction space, supporting on-the-fly processing in GPU environments.
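A gradient-descent sketch of such a regularized loss is shown below. For simplicity a Tikhonov penalty R(ψ) = ||ψ||² stands in for TV, the measurements are noiseless, and all sizes, weights, and step sizes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 32
psi_true = np.exp(0.3j * rng.standard_normal((n, n)))    # phase-only ground truth
I_meas = np.abs(np.fft.fft2(psi_true, norm="ortho"))**2  # noiseless "measurements"

lam = 1e-3   # regularization weight (assumed)

def loss(psi):
    r = np.abs(np.fft.fft2(psi, norm="ortho"))**2 - I_meas
    return np.sum(r**2) + lam * np.sum(np.abs(psi)**2)

def grad(psi):
    """Wirtinger gradient of the loss with respect to conj(psi)."""
    Psi = np.fft.fft2(psi, norm="ortho")
    r = np.abs(Psi)**2 - I_meas
    return 2 * np.fft.ifft2(r * Psi, norm="ortho") + lam * psi

# Start from a perturbed guess and descend with a small fixed step
psi = psi_true + 0.1 * (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
l0 = loss(psi)
for _ in range(300):
    psi -= 1e-4 * grad(psi)
print(l0, loss(psi))
```

Real implementations replace the fixed step with line search or adaptive optimizers and use a TV (or learned) prior in place of the Tikhonov term.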

Advantages

Lensless Imaging and Resolution Enhancement

Ptychography operates as a lensless imaging modality, reconstructing the complex-valued sample transmission function directly from far-field diffraction patterns recorded by a detector, without the need for objective lenses that introduce aberrations such as spherical or chromatic distortions in conventional microscopy. This approach circumvents the limitations imposed by imperfect optics, where lens aberrations typically degrade resolution beyond the diffraction limit defined by the numerical aperture. Instead, the ultimate spatial resolution in ptychography is governed by the illuminating wavelength and the positional stability of the scanning probe, enabling diffraction-limited performance limited only by these fundamental factors. In electron ptychography, this lensless configuration has facilitated sub-nanometer resolutions, with recent demonstrations achieving below 0.1 nm (sub-ångström) resolution for atomic-scale imaging in scanning transmission electron microscopy setups. Computational post-processing plays a central role in resolution enhancement, iteratively refining both the incident probe wavefunction and the sample's amplitude and phase to recover high-frequency information that exceeds the capture limits of the detector or probe aperture. This process effectively inverts the forward model, extending resolution beyond the classical limit by leveraging redundant overlapping measurements across scan positions. Compared to traditional lens-based microscopy, ptychography offers superior performance for radiation-sensitive specimens, as it distributes the illumination dose over multiple overlapping probes while computationally reconstructing a high-resolution image from low-dose per-position measurements, minimizing beam-induced damage. For example, atomic-resolution imaging of two-dimensional materials such as MoS2 has been realized at room temperature without cryogenic cooling, achieving information-limited resolutions that surpass those of aberration-corrected electron microscopes under similar conditions.

Tolerance to Incoherence and Noise

Ptychography exhibits significant tolerance to partial coherence in the illuminating beam, enabling reliable reconstructions even when the probe is imperfect. Algorithms model this partial coherence by propagating the mutual coherence function through the imaging system, where the measured intensity in the detector plane is given by the convolution of the fully coherent intensity with the Fourier transform of the complex degree of coherence \hat{\gamma}(q), expressed as I_{pc}(q) = I_{fc}(q) \otimes \hat{\gamma}(q). This approach assumes statistical stationarity of the mutual coherence function J(r_1, r_2), which then depends only on the separation \Delta r = r_1 - r_2, allowing the degree of coherence \mu(\Delta x, \Delta y) to be modeled as a Gaussian, \mu(\Delta x, \Delta y) = \exp\left[- ((\Delta x)^2 + (\Delta y)^2) / 2\sigma^2\right]. The overlapping scan positions in ptychography provide redundant constraints across multiple diffraction patterns, enhancing robustness to reduced coherence (e.g., when the coherence width \sigma is less than 25% of the probe size), with reconstruction quality improving for overlaps exceeding 40%. For two field components \psi_1 and \psi_2 linked by a simple coherence factor \mu, the combined intensity can be written as I = |\psi_1|^2 + |\psi_2|^2 + 2 \operatorname{Re}(\mu\, \psi_1^* \psi_2), highlighting how partial coherence modulates the observed interference signal. This modeling enables ptychographic algorithms to jointly retrieve the object and the coherence properties without prior knowledge of the beam's spatial or temporal coherence, distinguishing it from traditional coherent diffractive imaging techniques that degrade sharply under similar conditions. Ptychography also demonstrates robust handling of noise, particularly the Poisson-distributed shot noise prevalent in photon- or electron-limited measurements. Reconstruction algorithms employ statistical methods, such as maximum-likelihood estimation, to optimize the likelihood of observed intensities under Poisson statistics, minimizing the impact of noise on convergence.
This approach outperforms least-squares methods by directly incorporating the Poisson property that the variance equals the mean intensity, leading to more accurate amplitude and phase estimates even at low signal levels. For instance, in electron ptychography, low-dose imaging at doses of approximately 35–49 electrons per square ångström has achieved resolutions of 5.8–8.4 Å for biological samples such as apoferritin and bacteriophage sheaths, using iterative refinements that leverage the full 4D dataset. These noise-tolerant strategies enable imaging of beam-sensitive specimens, such as proteins in cryo-electron microscopy, where cumulative damage is minimized by distributing the dose across overlapping probes (e.g., 0.5 electrons per square ångström per position), preserving structural integrity during dynamic processes. Advanced computational techniques, including reconstructions with maximum-likelihood objectives, further enhance this robustness by adapting to varying noise regimes.
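A minimal numerical illustration of maximum-likelihood estimation under Poisson statistics: for repeated counts at a single rate, the minimizer of the Poisson negative log-likelihood coincides with the sample mean. The rate, sample size, and search grid below are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
true_rate = 3.5                        # mean photon count in a dim pixel (assumed)
counts = rng.poisson(true_rate, size=10000)

# Poisson negative log-likelihood (constant terms dropped) over candidate rates
rates = np.linspace(0.5, 10, 2000)
nll = np.array([np.sum(lam - counts * np.log(lam)) for lam in rates])
mle = rates[np.argmin(nll)]

# The ML estimate lands on the grid point nearest the sample mean
print(mle, counts.mean())
```

In full ptychographic reconstructions the same principle is applied per detector pixel, with the modeled intensity |F{ψ}|² playing the role of the rate.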

Self-Calibration and Multiple Scattering Inversion

Ptychography's self-calibration capability arises from its ability to jointly reconstruct the illumination probe function and the sample object function without requiring prior knowledge of either, thereby correcting for aberrations and instabilities inherent in the imaging system. This process leverages the redundancy in overlapping diffraction patterns to iteratively refine both components, ensuring that errors in the probe—such as wavefront distortions or focus shifts—are compensated during the reconstruction. Seminal work demonstrated that this joint estimation enhances reconstruction fidelity by adapting to experimental imperfections, such as probe aberrations, without external references. A key implementation of this self-calibration is the extended ptychographic iterative engine (ePIE), which updates the probe and object estimates in a coupled manner to minimize discrepancies between measured and simulated data. The probe update rule in ePIE is expressed as P' = P + \beta \frac{O^*}{\max |O|^2} \left( \psi' - \psi \right), where P is the current probe estimate, \beta is an adjustable step size controlling the update strength, \psi' is the exit wave after the Fourier modulus projection onto the measured diffraction data, O is the object estimate, and \psi = P \cdot O is the calculated exit wave. This formulation allows the algorithm to dynamically correct probe variations across positions, improving overall reconstruction quality and stability in configurations involving strong aberrations. The benefits include robustness to experimental instabilities, such as vibrations or thermal drifts, which would otherwise degrade reconstructions in traditional methods. To address multiple scattering in thick samples, where forward propagation involves complex interactions beyond the single-scattering approximation, ptychography employs multislice models that divide the specimen into thin parallel slices and simulate wave propagation through each successive layer.
This approach inverts the multiple-scattering forward model by iteratively optimizing the transmission functions of all slices against the measured data, accurately recovering phase and amplitude even in volumetrically extended objects. Pioneering demonstrations showed that multislice ptychography resolves features in thick samples at resolutions limited only by the probe size, overcoming depth-of-field constraints in conventional techniques. The integration of multislice models enables tomographic reconstructions in ptychography by combining multi-slice inversions with angular sampling, yielding high-fidelity volumetric images with reduced data requirements compared to full tomographic scans. This capability adapts to scattering-dominated regimes, such as in thick specimens or imaging of dense materials, and supports three-dimensional visualization of internal structures without destructive sectioning. Overall, these self-calibration and multiple-scattering inversion strategies make ptychography particularly suited for imaging dynamic or aberrant systems, enhancing its utility in high-resolution volumetric studies.
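The coupled ePIE updates described in this section can be sketched in a few lines of NumPy. This is a minimal, illustrative sub-iteration only (square arrays, a single probe mode, far-field propagation by a plain FFT); the function and variable names are assumptions, not from a specific library:

```python
import numpy as np

def epie_update(obj, probe, pos, measured_amp, alpha=1.0, beta=1.0):
    """One ePIE update at a single scan position.

    obj          : complex object estimate (2D array, modified in place)
    probe        : complex probe estimate (2D array, smaller than obj)
    pos          : (row, col) top-left corner of the probe on the object
    measured_amp : measured diffraction amplitudes (sqrt of intensities)
    alpha, beta  : object / probe step sizes
    """
    py, px = probe.shape
    r, c = pos
    obj_patch = obj[r:r+py, c:c+px].copy()   # keep the pre-update object

    # Forward model: exit wave and its far-field diffraction.
    psi = probe * obj_patch
    Psi = np.fft.fft2(psi)

    # Modulus constraint: keep the computed phase, impose measured amplitude.
    Psi_corrected = measured_amp * np.exp(1j * np.angle(Psi))
    psi_corrected = np.fft.ifft2(Psi_corrected)
    diff = psi_corrected - psi

    # Coupled updates: each estimate is refined using the conjugate of the
    # other, normalized by its peak intensity, as in the update rule above.
    obj[r:r+py, c:c+px] += alpha * np.conj(probe) / (np.abs(probe)**2).max() * diff
    probe += beta * np.conj(obj_patch) / (np.abs(obj_patch)**2).max() * diff
    return obj, probe
```

In practice this update is applied sweeping over all scan positions in random order for many iterations; when the measured amplitudes are consistent with the current estimates, the correction term vanishes and the estimates are left unchanged.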

Limitations

Computational and Data Requirements

Ptychography reconstruction involves processing high-dimensional datasets, particularly in 4D-STEM modalities where each scan position yields a full diffraction pattern, resulting in datasets significantly larger than those from conventional imaging. For large-scale acquisitions, such as those covering extended fields of view at high resolution, data volumes routinely exceed 100 GB, demanding substantial storage and computational resources. Iterative algorithms, essential for phase retrieval and enforcing overlap constraints, require accelerated hardware like multi-GPU systems to handle the matrix operations and Fourier transforms across thousands of scan positions. A primary challenge arises from memory demands for storing overlap regions between adjacent illuminations, where consumption scales quadratically with the number of scan points in certain implementations, limiting reconstructions to smaller datasets without distributed processing. Real-time processing, desirable for experimental feedback, remains constrained even with 2025 advancements in GPU-accelerated algorithms, as full iterations on large datasets can take minutes to hours depending on overlap ratios and hardware. These limitations often necessitate compromises, such as reducing scan density or overlap to fit within available memory, which can sacrifice resolution for feasibility. To mitigate these burdens, data reduction techniques preprocess diffraction patterns by discarding redundant or low-information pixels prior to reconstruction, as demonstrated by a 2023 Argonne National Laboratory algorithm that achieves up to 80% data compression while preserving image quality. Such methods not only lower storage needs but also accelerate convergence in iterative solvers, enabling processing on standard clusters. Overall, ptychography's computational trade-offs require balancing higher resolution—demanding denser scans and more overlaps—against processing speed, often guided by application-specific priorities like throughput in synchrotron experiments.
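The dataset sizes quoted above follow from simple arithmetic: scan points times detector pixels times bytes per pixel. A quick back-of-the-envelope check, with hypothetical but typical dimensions (a 1k × 1k scan and a 256 × 256 detector, neither taken from a specific experiment):

```python
# Back-of-the-envelope size of a 4D-STEM ptychography dataset:
# (scan points) x (detector pixels) x (bytes per pixel).
scan_points = 1024 * 1024        # 1k x 1k probe positions (hypothetical scan)
detector_pixels = 256 * 256      # one diffraction pattern per position
bytes_per_pixel = 4              # e.g. 32-bit counts

total_bytes = scan_points * detector_pixels * bytes_per_pixel
print(f"{total_bytes / 1e9:.0f} GB")   # -> 275 GB, well past the 100 GB mark
```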

Experimental and Practical Challenges

Ptychography experiments demand exceptional mechanical stability to achieve sub-nanometer scan precision, as even minor vibrations between the beam and sample can introduce incoherent blurring in diffraction patterns, severely limiting resolution. For instance, random high-frequency vibrations degrade pattern visibility and cause artifacts, reducing achievable resolutions to tens of nanometers in affected setups. This requirement for nanometer-scale positioning accuracy in scanning stages and environmental isolation poses significant hurdles in laboratory implementations, particularly for long-duration scans at synchrotron facilities. Beam damage represents a critical limitation, especially for biological specimens, where high-energy illumination induces structural degradation that compromises sample integrity and resolution. In cryo-ptychography, radiation damage from X-ray or electron beams limits exposures, often necessitating cryogenic conditions to preserve frozen-hydrated states, yet low coherent flux still weakens signal-to-noise ratios in diffraction data. This trade-off forces researchers to balance dose efficiency with imaging quality, as insufficient flux leads to noisy patterns that hinder reconstruction. Experimental setups in ptychography are inherently complex, requiring precise alignment for variants like Bragg or multislice configurations to ensure accurate interslice distances and focus-to-sample positioning. At synchrotrons, achieving high numerical apertures with optics such as multilayer Laue lenses demands coherent illumination and wavefront cleaning via pinholes, while the high cost and competitive access to these facilities restrict widespread adoption. Misalignments of even 10% can distort the effective Fresnel number, degrading fidelity in thick or in-situ samples. Recent advancements have highlighted ongoing issues with partial coherence in emerging sources, particularly in terahertz ptychography, where low signal-to-noise ratios and source instabilities challenge reconstruction under low-overlap conditions.
In 2025 experiments, terahertz setups using quantum cascade lasers faced difficulties with convergent illumination and noise, limiting resolutions to around 3.75 wavelengths despite partial coherence lengths of tens of micrometers. To mitigate these hurdles, hybrid experimental setups combining multiple illumination modes have been developed, enhancing robustness and resolution in multislice configurations without relying solely on synchrotron access. Additionally, AI-assisted design tools like Ptychoscopy, introduced in 2025, streamline parameter optimization—such as probe convergence angles and defocus values—via user-friendly interfaces that incorporate microscope calibrations, guiding setups toward higher dose efficiency and reconstruction quality. These tools evaluate sampling impacts in real and reciprocal space, reducing trial-and-error in complex alignments.
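Overlap between adjacent probe positions is one of the quantities such planning tools evaluate. A minimal sketch of the usual linear-overlap estimate for a circular probe (the numbers below are purely illustrative, chosen to match a terahertz-scale scan with a 1.2 mm step at 81% overlap; the function name is an assumption):

```python
def linear_overlap(probe_diameter, step):
    """Fractional linear overlap between adjacent circular probe positions."""
    return max(0.0, 1.0 - step / probe_diameter)

# Illustration: a 1.2 mm step at 81% linear overlap implies a probe
# diameter near 1.2 / (1 - 0.81) ~ 6.3 mm.
d = 1.2 / (1 - 0.81)
print(f"probe diameter = {d:.1f} mm, overlap = {linear_overlap(d, 1.2):.0%}")
```

Overlap ratios of roughly 60–85% are the regime typically targeted, since the redundancy they provide is what makes joint probe/object recovery well-posed.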

Applications

X-ray and Synchrotron Imaging

Ptychography has emerged as a powerful technique for high-resolution imaging in X-ray science and at synchrotron facilities, enabling nanoscale structural and chemical analysis of materials without the limitations of traditional lenses. At synchrotron sources, the coherent beams produced facilitate lensless imaging, achieving resolutions below 10 nm while penetrating samples up to several micrometers thick, which is essential for studying bulk materials like semiconductors and battery devices. This capability arises from the high energy of hard X-rays (typically 5-20 keV), allowing non-destructive imaging of extended, heterogeneous samples that would be challenging with electron-based methods. In battery research, operando ptychography provides dynamic nanoscale insights into electrochemical processes, such as phase transformations and ion diffusion in lithium-ion cells. For instance, hard X-ray ptychography has been applied to thin-film all-solid-state batteries, revealing morphological changes and interface evolution during charging cycles with sub-20 nm resolution, enabling real-time monitoring under operational conditions without significant beam-induced damage. This approach supports the design of improved electrodes by visualizing nanoscale degradation mechanisms directly. Hyperspectral 3D ptychotomography extends these capabilities by combining ptychographic imaging with tomographic scanning and energy-dispersive detection, yielding volumetric chemical maps of complex materials. Recent implementations at beamlines have demonstrated 3D chemical mapping of particles, resolving elemental distributions and oxidation states across volumes exceeding 10 μm³ with resolutions around 15 nm. Such techniques leverage broadband illumination to capture multi-energy diffraction patterns in a single acquisition, reducing scan times for thick samples. To address imaging speed for larger fields of view, multibeam X-ray ptychography employs nano-lithographically patterned apertures to generate multiple coherent probes simultaneously, accelerating acquisition by factors of 10-100 while maintaining nanoscale resolution.
Demonstrated at energies up to 20 keV, this method has imaged extended structures, such as test patterns, in minutes rather than hours, making it suitable for time-resolved studies of dynamic processes in thick samples. Key examples include strain mapping in semiconductor heterostructures, where Bragg ptychography quantifies lattice distortions in silicon-on-insulator devices with 5-10 nm resolution, revealing strain relaxation patterns at interfaces critical for device performance. Similarly, for defect analysis, ptychographic tomography uncovers buried 3D voids and dislocations in thin films, providing quantitative metrics on defect densities that influence optoelectronic properties. These applications highlight ptychography's role in non-destructive materials characterization. Recent developments in spectroscopic ptychography setups further enable chemical-state mapping by integrating energy-dispersive detectors with focused beams, allowing simultaneous retrieval of phase, amplitude, and absorption spectra. Optimized optics, such as Kirkpatrick-Baez mirrors tuned for polychromatic beams, have achieved hyperspectral contrast in battery materials, distinguishing elemental distributions with sub-20 nm precision and probing chemical states across energy ranges of 1-2 keV. This advancement supports comprehensive analysis of multi-component systems in operando environments.

Electron Microscopy

Electron ptychography in transmission electron microscopy (TEM) leverages four-dimensional scanning TEM (4D-STEM) data to achieve sub-angstrom imaging of materials, enabling the reconstruction of complex transmission functions without relying on aberration-corrected lenses. This technique scans a focused probe across overlapping regions of a sample, capturing diffraction patterns with pixelated detectors, and computationally retrieves both amplitude and phase information to surpass the resolution limits of conventional TEM. Unlike X-ray methods, electron ptychography operates in transmission with high-energy electrons, providing unparalleled atomic-scale detail for solid-state materials while navigating challenges like multiple scattering. A primary application is atomic-scale imaging on conventional TEM setups, where electron ptychography has demonstrated resolutions down to 0.5 Å on standard instruments, democratizing high-end capabilities for materials characterization. In 4D-STEM configurations, it excels at visualizing defects such as single-atom vacancies in two-dimensional materials like monolayer transition metal dichalcogenides, revealing strain fields and lattice distortions with atomic precision. For instance, direct observation of sulfur vacancies in MoS₂ has highlighted local charge redistribution and electronic structure perturbations around defects. These capabilities are particularly valuable in materials science for studying nanoscale imperfections that influence properties like conductivity and mechanical strength. Advances in electron ptychography include low-dose protocols tailored for beam-sensitive materials, such as metal-organic frameworks (MOFs) and covalent organic frameworks (COFs), where electron doses below 10 electrons per Ų preserve fragile structures during imaging. This approach has enabled atomic-level visualization of linker orientations and pore architectures in beam-sensitive samples, minimizing the radiation damage that plagues traditional TEM.
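The dose budgets quoted for such protocols follow from beam current, dwell time, and scan step. A small sketch of that bookkeeping (the specific current, dwell, and step values are illustrative assumptions, not from a particular experiment):

```python
E_CHARGE = 1.602176634e-19  # elementary charge in coulombs

def dose_e_per_A2(beam_current_pA, dwell_time_us, step_A):
    """Electron dose (e-/Angstrom^2) for a raster scan: electrons delivered
    during one dwell, spread over one step x step scan cell."""
    electrons = beam_current_pA * 1e-12 * dwell_time_us * 1e-6 / E_CHARGE
    return electrons / step_A**2

# Illustrative numbers: a 1 pA beam, 10 us dwell, and a 4 Angstrom step
# land in the few-e-/A^2 regime that low-dose protocols target.
print(f"{dose_e_per_A2(1.0, 10.0, 4.0):.1f} e-/A^2")   # -> 3.9 e-/A^2
```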
Recent innovations incorporate multislice propagation models augmented by diffusion-based generative priors to enhance 3D reconstructions of thick specimens by iteratively refining atomic positions through learned crystal structure distributions. These methods improve depth resolution to ~2.5 nm while accounting for multiple scattering effects. Exemplary uses encompass three-dimensional reconstruction, where ptychographic imaging has mapped buried heterointerfaces in van der Waals materials like hBN-graphene stacks, resolving atomic layering and twist angles with 0.57 Å lateral accuracy. In magnetic materials, it facilitates magnetization mapping by integrating Lorentz effects into phase reconstructions, imaging nanoscale magnetic textures in thin films (~4 atoms thick) to reveal domain walls and skyrmion-like configurations. The practical implementation relies on high-speed pixelated detectors, such as direct electron detectors, which capture full diffraction patterns at rates exceeding 1000 frames per second, making electron ptychography feasible on routine TEM platforms and enabling rapid acquisition for dynamic studies.
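The multislice forward model underlying these thick-specimen reconstructions can be sketched compactly: the wave is multiplied by each slice's complex transmission function and then propagated a short distance to the next slice. This is a minimal, paraxial illustration assuming square arrays; the function names and parameters are not from a specific package:

```python
import numpy as np

def fresnel_propagate(wave, wavelength, dz, pixel_size):
    """Angular-spectrum (paraxial) propagation of a 2D complex wave over dz."""
    n = wave.shape[0]                       # assumes a square array
    fx = np.fft.fftfreq(n, d=pixel_size)
    FX, FY = np.meshgrid(fx, fx, indexing="ij")
    # Fresnel transfer function: pure phase, so energy is conserved.
    H = np.exp(-1j * np.pi * wavelength * dz * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(wave) * H)

def multislice_exit_wave(probe, slices, wavelength, dz, pixel_size):
    """Propagate a probe through a stack of thin complex transmission slices."""
    wave = probe
    for t in slices:
        wave = wave * t                                  # transmit through slice
        wave = fresnel_propagate(wave, wavelength, dz, pixel_size)
    return wave
```

Inverting this model, i.e. recovering all the slice transmission functions from the measured far-field intensities of the exit wave, is what multislice ptychography adds on top of the single-slice reconstruction.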

Optical and Emerging Modalities

Optical ptychography, particularly through Fourier ptychographic microscopy (FPM), has advanced biomedical imaging by enabling high-resolution, wide-field visualization without the need for high numerical aperture (NA) lenses, which traditionally limit the field of view in conventional microscopy. This technique synthesizes a high-resolution image from multiple low-resolution images captured under varying illumination angles, making it suitable for applications like digital pathology, where large tissue samples must be scanned efficiently. A 2025 review highlights FPM's role in digital pathology, demonstrating its ability to achieve sub-micron resolution over millimeter-scale fields, facilitating automated analysis of unstained slides for cancer detection and reducing the need for costly high-NA objectives. In live cell imaging, optical ptychography provides label-free, quantitative contrast that reveals cellular dynamics without phototoxicity from stains or high-intensity light. Early demonstrations showed ptychography enabling high-contrast imaging of unstained live cells, such as neurons and stem cells, by reconstructing both amplitude and phase from diffraction patterns, achieving resolutions down to 400 nm. This approach supports real-time monitoring of cellular processes like migration and division, as seen in commercial systems that use ptychography for automated tracking in time-lapse studies. Emerging modalities extend ptychography to terahertz (THz) frequencies, leveraging the non-ionizing nature of THz waves for non-destructive testing of materials and biological samples. THz ptychography reconstructs complex-valued images from intensity measurements, offering penetration depths of millimeters in non-conductive materials, ideal for inspecting composites or layered structures without damage. A 2022 study demonstrated THz ptychography for imaging tissues on slides, achieving efficient large field-of-view reconstructions with sub-wavelength resolution using optimized scanning strategies.
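The core FPM synthesis step can be sketched as follows: each illumination angle samples a shifted, pupil-limited region of the object's Fourier spectrum, and the reconstruction iteratively replaces the amplitude of that region with the corresponding measured low-resolution image. This is a minimal single sub-update under simplifying assumptions (no pupil aberration refinement, integer spectrum shifts); the names are illustrative, not from a specific library:

```python
import numpy as np

def fpm_update(spectrum, pupil_mask, shift, measured_amp):
    """One Fourier-ptychography sub-update: impose a measured low-res
    amplitude on the pupil-filtered, shifted region of the object spectrum.

    spectrum     : current high-res object spectrum estimate (2D complex)
    pupil_mask   : boolean low-NA pupil support, same shape as spectrum
    shift        : (dy, dx) integer spectrum shift set by the LED angle
    measured_amp : measured low-resolution image amplitude
    """
    shifted = np.roll(spectrum, shift, axis=(0, 1))
    low_res_spec = shifted * pupil_mask
    low_res_img = np.fft.ifft2(low_res_spec)
    # Keep the phase estimate, replace the amplitude with the measurement.
    corrected = measured_amp * np.exp(1j * np.angle(low_res_img))
    new_spec = np.fft.fft2(corrected)
    # Write the corrected pupil region back into the spectrum.
    shifted[pupil_mask] = new_spec[pupil_mask]
    return np.roll(shifted, (-shift[0], -shift[1]), axis=(0, 1))
```

Looping this update over all illumination angles for several passes stitches the overlapping spectrum patches into a synthetic aperture much larger than the physical pupil, which is what yields high resolution over a wide field.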
Recent developments incorporate deep learning to create generalizable models for ptychography reconstruction across optical and emerging modalities, reducing computational demands and improving adaptability to diverse experimental conditions. A probe-centric framework, introduced in 2025, trains a single physics-informed neural network to handle unseen datasets from multiple setups, enabling real-time feedback and steering during experiments in both visible and THz regimes. This approach enhances reconstruction fidelity for dynamic samples, such as live cells or THz-inspected materials, by generalizing beyond specific illumination or probe configurations.

History

Origins in Crystallography

Ptychography originated as a method to address the phase problem in crystallography, where the phases of diffracted waves cannot be directly measured, only their intensities, hindering the reconstruction of atomic structures via Fourier synthesis. In 1969, Walter Hoppe proposed using a tightly focused coherent electron beam to illuminate overlapping regions of a crystalline sample, producing diffraction patterns with broadened and interfering Bragg peaks that provide the necessary redundancy for phase retrieval. This approach aimed to solve the phase problem by exploiting the interference from translated probe positions, where each pattern shares common structural information with its neighbors. The term "ptychography," derived from the Greek word for "folding," was coined by Rainer Hegerl and Walter Hoppe in 1970 to describe this technique, emphasizing the "folding" of overlapping data to enable phase evaluation in generalized scenarios for electron microscopy. Their work built on Hoppe's initial idea by formalizing the dynamic theory of crystal structure analysis through diffraction in an inhomogeneous primary wave field, highlighting how spatial overlap creates data redundancy to uniquely determine the phases. Despite these theoretical foundations, practical implementation was hindered by the era's computational limitations, including insufficient memory and processing power for the iterative algorithms required to handle the highly redundant datasets from multiple overlapping patterns. Hoppe himself later described the technique as a "nearly forgotten old idea" due to these challenges. Key milestones in the 1970s included theoretical papers exploring the redundancy in ptychographic data, such as extensions by Hoppe and collaborators that analyzed how overlap ratios ensure uniqueness in phase reconstruction, laying groundwork for future algorithmic developments.

Development of Reconstruction Algorithms

Building on earlier theoretical work, including formalizations by Rodenburg and Bates in 1992 for electron microscopy and Chapman in 1996 for X-rays, John Rodenburg and collaborators in the 1990s pioneered algorithmic developments for ptychography in coherent imaging, adapting the error reduction (ER) and hybrid input-output (HIO) phase retrieval methods—originally formulated by Fienup for single diffraction patterns—to accommodate the redundant, overlapping data from scanned probes in ptychographic setups. These adaptations addressed stagnation issues in ER by leveraging HIO's feedback mechanism outside the object support, enabling initial theoretical and experimental demonstrations in electron microscopy that exceeded conventional resolution limits. A key early milestone was the 1998 experimental validation using Wigner distribution deconvolution alongside these iterative methods, which recovered both specimen structure and illuminating wave from electron diffraction patterns of crystalline samples. The transition from theoretical frameworks to practical implementations accelerated in the early 2000s, culminating in the introduction of the ptychographical iterative engine (PIE) algorithm in 2004 by Rodenburg and colleagues. This novel phase retrieval approach extended ER principles to multiple overlapping illuminations, iteratively updating the object estimate across probe positions while enforcing measured intensities, thus overcoming limitations of prior non-iterative deconvolution techniques and achieving lensless transmission microscopy with a movable aperture. The PIE method demonstrated robustness to noise and partial coherence, marking a shift toward broader applicability beyond crystalline electron imaging. 
A significant advancement came in 2009 with the extended ptychographical iterative engine (ePIE) developed by Maiden and Rodenburg, which explicitly handled unknown probe functions by simultaneously refining both the specimen transmission and illumination parameters during iterations. This refinement improved convergence for noisy datasets and incomplete overlaps, building on PIE by incorporating probe self-correction via error metrics in the overlap regions. These algorithmic innovations facilitated proofs-of-concept in X-ray and electron modalities, such as the 2007 hard X-ray demonstration that imaged extended objects at synchrotron sources without lenses.

Modern Advances and Adoption

In the 2010s, ptychography experienced significant growth through the introduction of Fourier ptychography, which extended the technique to visible light for wide-field, high-resolution imaging without specialized optics. This boom was catalyzed by Zheng et al.'s 2013 demonstration of Fourier ptychographic microscopy (FPM), achieving resolutions beyond the diffraction limit of conventional microscopes using iterative phase retrieval on low-resolution images under variable illumination. Concurrently, electron ptychography gained traction in scanning transmission electron microscopy (STEM) via 4D-STEM, enabling phase contrast and structural mapping at the atomic scale, with early adoptions in materials science for strain and orientation analysis by the late 2010s. The 2020s marked a shift toward integration of machine learning, particularly neural networks, to accelerate reconstructions and handle noisy or incomplete data in ptychographic imaging. Physics-informed deep neural networks have enabled generalizable reconstructions across diverse experimental conditions, reducing reconstruction times from hours to seconds while maintaining fidelity. Real-time ptychography emerged as a key advance, with edge-computing frameworks allowing on-the-fly inversions during acquisition, as shown in 2023 workflows that process detector data streams at synchrotron facilities. In electron microscopy, low-dose protocols advanced in 2024, achieving sub-nanometer resolution in cryo-samples with minimal beam exposure to preserve beam-sensitive biological structures like proteins. Adoption has expanded from research laboratories to broader scientific infrastructure, facilitated by open-source tools that democratize experimental design and analysis. The 2025 release of Ptychoscopy, a Python-based software, streamlines experimental parameter optimization for ptychography, aiding users in achieving optimal overlap and probe conditions for high-quality reconstructions. Hyperspectral X-ray ptychography has seen increased use in synchrotron-based materials research, enabling simultaneous structural and chemical mapping with broadband detectors.
Key milestones include routine sub-nanometer resolutions in electron ptychography by 2024, applied to diverse samples from semiconductors to biomolecules, surpassing traditional limits without aberration correction. Extensions to terahertz frequencies have further broadened applicability, with untrained neural networks in 2025 enabling reconstructions in non-visible regimes for imaging complex media like textiles or concealed objects.

References

  1. [1]
    Ptychography: A brief introduction - Rodenburg - Wiley Online Library
    Aug 20, 2025 · Unlike conventional microscopy with lenses, ptychography does not provide a real or virtual image that can be seen directly. Instead, it uses a ...<|control11|><|separator|>
  2. [2]
    Optical ptychography for biomedical imaging - PubMed Central - NIH
    Ptychography is an enabling microscopy technique for both fundamental and applied sciences. In the past decade, it has become an indispensable imaging tool.
  3. [3]
    None
    Below is a merged summary of the ptychography definitions and principles from the Rodenburg and Maiden chapter, consolidating all information from the provided segments into a comprehensive response. To retain maximum detail and clarity, I will use a combination of narrative text and a table in CSV format for key concepts that benefit from structured comparison (e.g., modalities, overlap details). The response avoids redundancy while ensuring all unique points are included.
  4. [4]
    Ptychography and Related Diffractive Imaging Methods
    Ptychography is a nonholographic solution of the phase problem. It is a method for calculating the phase relationships among different parts of a scattered ...
  5. [5]
    High-Resolution Scanning X-ray Diffraction Microscopy | Science
    Jul 18, 2008 · We demonstrate a ptychographic imaging method that bridges the gap between CDI and STXM by measuring complete diffraction patterns at each point of a STXM scan.
  6. [6]
  7. [7]
    Near-field ptychography: phase retrieval for inline holography using ...
    May 31, 2013 · The experimental procedure entails combining multiple diffraction measurements collected as a sample is scanned through a localized illumination ...
  8. [8]
  9. [9]
    X-Ray Near-Field Ptychography for Optically Thick Specimens
    Jan 21, 2015 · In this paper, we demonstrate that near-field ptychography can be used to efficiently perform phase retrieval on a uranium sphere with a diameter of about 4 6 ...
  10. [10]
    Near-field ptychography using lateral and longitudinal shifts
    Jul 31, 2015 · Here we present a generalized ptychography approach to simultaneously reconstruct object and probe in the optical near-field.
  11. [11]
    Near-field electron ptychography using full-field structured illumination
    In this study, we propose a new configuration for near-field ptychography for electron beams. It utilizes full-field and structured illumination generated by a ...
  12. [12]
    Efficient large field of view electron phase imaging using near-field ...
    In this paper we introduce a different approach based on near-field ptychography, where the focussed beam is replaced by a wide-field, structured illumination.
  13. [13]
    A phase space model of Fourier ptychographic microscopy
    A new computational imaging technique, termed Fourier ptychographic microscopy (FPM), uses a sequence of low-resolution images captured under varied ...
  14. [14]
    Whole-field, high-resolution Fourier ptychography with neural pupil ...
    Oct 6, 2025 · The NePE-FPM integrates the physical forward model with data-driven optimization, enabling high-fidelity reconstruction of off-axis areas ...
  15. [15]
    Fourier ptychography microscopy for digital pathology
    Jun 24, 2025 · A standard Fourier ptychography microscopy setup differs from a normal microscope by replacing a single light source with an LED array capable ...
  16. [16]
  17. [17]
    Multi-slice ptychographic tomography | Scientific Reports - Nature
    Feb 1, 2018 · Multi-slice ptychography can handle multiple-scattering thick specimens and has a much smaller data requirement than ptychographic tomography.
  18. [18]
    Multi-slice ptychography with large numerical aperture multilayer ...
    The multi-slice ptychography approach includes multiple scattering into the reconstruction engine, and recovers several axial planes simultaneously. 3D ...
  19. [19]
    Electron ptychography achieves atomic-resolution limits set by ...
    May 21, 2021 · Multislice electron ptychography provides quantitative phase information, with the phase increasing linearly as more layers are added into the ...
  20. [20]
    Improving Multislice Electron Ptychography with a Generative Prior
    Multislice electron ptychography (MEP) reconstructs crystal structures. MEP-Diffusion, a diffusion model, is used to enhance reconstruction quality, achieving ...
  21. [21]
    Multi-slice electron ptychographic tomography for three-dimensional ...
    Multi-slice ptychographic electron tomography allows 3D imaging beyond depth of field limits, achieving 2 Å axial and 0.7 Å transverse resolution, a 13.5-fold ...
  22. [22]
    Quantitative Jones matrix imaging using vectorial Fourier ... - NIH
    Feb 14, 2022 · Vectorial Fourier ptychography (vFP) is a microscopic imaging technique using variable-angle illumination to recover the complex Jones matrix ...
  23. [23]
    Extending the capabilities of vectorial ptychography to circular ...
    Sep 26, 2023 · Finally, Bragg and off-Bragg CLC films were investigated using vectorial ptychography with the improved polarization scheme. Figure 4 (a) ...
  24. [24]
    Phase retrieval algorithms: a comparison - Optica Publishing Group
    J. R. Fienup, "Phase retrieval algorithms: a comparison," Appl. Opt. 21 ... (Society of Photo-Optical Instrumentation Engineers, Bellingham, Wash., 1982), to be ...
  25. [25]
    An improved ptychographical phase retrieval algorithm for diffractive ...
    The ptychographical iterative engine (or PIE) is a recently developed phase retrieval algorithm that employs a series of diffraction patterns recorded as a ...
  26. [26]
    [PDF] Iterative Algorithms for Ptychography - White Rose eTheses Online
    Iterative computational loop. This computational method is called Error Reduction (ER) [31]. By applying the constraints sequentially and iteratively, the ...
  27. [27]
    Combining ptychographical algorithms with the Hybrid Input-Output ...
    In this article we combine the well-known Ptychographical Iterative Engine (PIE) with the Hybrid Input-Output (HIO) algorithm.
  28. [28]
    Probe retrieval in ptychographic coherent diffractive imaging
    Ptychography is a coherent diffractive imaging method that uses multiple diffraction patterns obtained through the scan of a localized illumination on the ...
  29. [29]
    A computational framework for ptychographic reconstructions - PMC
    Notable examples are the difference map (DM) [7,26], the relaxed averaged alternating reflections (RAAR) algorithm [27] and other similar formulations [28].
  30. [30]
    [PDF] Efficient Algorithms for Ptychographic Phase Retrieval
    In this paper, we analyze some of the existing methods for solving ptycho- graphic phase retrieval problem from a numerical optimization point of view. In.
  31. [31]
    Introduction to electron ptychography for materials scientists
    Sep 25, 2024 · Electron ptychography is a computational imaging method that utilizes the rich information in four-dimensional scanning transmission electron microscopy ...
  32. [32]
    Time-domain ptychography | Phys. Rev. A
    Feb 17, 2015 · The spatial resolution is limited by the positioning accuracy, the stability of the entire setup, and by the angular range of scattered wave ...
  33. [33]
    Sub-ångström resolution ptychography in a scanning electron ...
    Oct 14, 2025 · Achieving sub-ångström (<1 Å) resolution in electron microscopy typically requires a high-energy (>30 keV) beam and a transmission electron ...
  34. [34]
    Hard-X-Ray Lensless Imaging of Extended Objects | Phys. Rev. Lett.
    Jan 18, 2007 · We demonstrate a hard-x-ray microscope that does not use a lens and is not limited to a small field of view or an object of finite size.Missing: seminal | Show results with:seminal
  35. [35]
    Atomically resolved imaging of radiation-sensitive metal-organic ...
    Jan 22, 2025 · Electron ptychography, recognized as an ideal technique for low-dose imaging, consistently achieves deep sub-angstrom resolution at electron ...
  36. [36]
  37. [37]
  38. [38]
    Ptychoscopy: a user friendly experimental design tool for ptychography
    Jul 10, 2025 · These can be set up and named according to, for instance, the lens settings and apertures used on a given microscope. The tool automatically ...
  39. [39]
    Streaming Large-Scale Microscopy Data to a Supercomputing Facility
    Other microscopy facilities are installing similar high frame rate detectors with the ability to routinely generate > 100 GB datasets (Chatterjee et al., 2021; ...Background · Microscope Stability... · Stability Experiment
  40. [40]
    [PDF] High-Performance Multi-Mode Ptychography Reconstruction ... - OSTI
    Aug 8, 2018 · We report an accelerated version of the multi-mode difference map algorithm for ptychography reconstruction using multiple distributed GPUs.
  41. [41]
    Live Processing of Momentum-Resolved STEM Data for First ...
    The memory consumption of the current SSB ptychography implementation scales as formula with the number of scan points at a constant aspect ratio since the ...
  42. [42]
    Ptychographic reconstructions performed in real time and offline ...
    Apr 26, 2025 · Live iterative ptychography with projection-based algorithms. In ICASSP 2024-2024 IEEE International Conference on Acoustics, Speech and ...Missing: ANL | Show results with:ANL
  43. [43]
    New algorithm dramatically reduces computational cost of x-ray ...
    Mar 6, 2024 · New algorithm dramatically reduces computational cost of x-ray ptychography. Key lies in removing unneeded data before image reconstruction ...
  44. [44]
    Argonne's New AI Application Reduces Data Processing Time by ...
    Apr 26, 2024 · By using machine learning on individual X-ray diffraction patterns, the workflow eliminates the need for the usual stringent overlapping ...<|control11|><|separator|>
  45. [45]
    Scalable and accurate multi-GPU-based image reconstruction of ...
    Mar 29, 2022 · In this paper, we provide an optimized intranode multi-GPU implementation that can efficiently solve large-scale ptychographic reconstruction problems.
  46. [46]
    Suppressing system instability in ptychography using least-squares ...
    Jun 26, 2025 · In this paper, we propose a novel purely algorithmic approach to address the impact of random vibration between the beam and the sample induced ...
  47. [47]
    [PDF] Investigating Ptychographic Phase Retrieval in Terms of Vibration ...
    Jan 16, 2024 · Thibault, et al., introduced the Difference Map (DM) algorithm for ptychography in 2009 [51]. During that time, ePIE did not exist yet. The DM ...<|separator|>
  48. [48]
    Advances and challenges in cryo ptychography at the ... - NIH
    This parallel reconstruction method allows one to do real-time data analysis, which is extremely helpful to guide the running ptychography experiment. FLY-SCAN ...
  49. [49]
    Cryo-electron ptychography: Applications and potential in biological ...
    Cryo-electron ptychography is an alternative method for characterizing biological systems under low-fluence conditions, potentially allowing 3D reconstruction ...
  50. [50]
    X-ray near-field multi-slice ptychography for in-situ imaging - Nature
    Sep 1, 2025 · Our work extends near-field X-ray ptychography to sub-50 nm spatial resolution by using multilayer Laue lenses. These high numerical aperture ...
  51. [51]
    Ptychographic X-ray computed tomography at a high-brilliance X-ray ...
    Jan 8, 2019 · After testing the experimental setup with the Siemens star, finer alignment was carried out on the butterfly wing scale, with particular ...
  52. [52]
    Terahertz ptychography enabled by untrained physics-driven neural ...
    Aug 7, 2025 · All the objects were raster scanned by the probe with a step size of 1.2 mm (overlap = 81%). A total of 9 x 9, 10 x 10, and 20 x 10 ...
  53. [53]
    Review of partially coherent diffraction imaging - ResearchGate
    Aug 6, 2025 · Partially coherent beams with low spatial coherence have many advantages, including strong anti-turbulence ability [18], speckle noise ...
  54. [54]
    Ptychoscopy: a user friendly experimental design tool for ptychography
    Jul 10, 2025 · PtychoScopy simplifies experimental design and guides the researchers across diverse scientific fields in setting up successful ptychographic ...
  55. [55]
    High-Resolution Multislice X-Ray Ptychography of Extended Thick ...
    Feb 4, 2014 · We report the first demonstration of hard x-ray ptychography using a multislice approach, which can solve the problem of the limited spatial resolution.
  56. [56]
    Dark-field X-ray ptychography: Towards high-resolution imaging of ...
    Oct 13, 2016 · Observation of thick samples with high ... In contrast, hard X-rays can probe a thick specimen owing to their high penetration power.
  57. [57]
    Broadband Ptychotomography with a Hyperspectral Detector
    Jun 27, 2025 · In this work, we design an optimized broadband spectroscopic ptychography setup and use it to perform 3D hyperspectral imaging of particles of battery material.
  58. [58]
    X‐Ray Multibeam Ptychography at up to 20 keV: Nano‐Lithography ...
    Jun 23, 2024 · The challenge of performing ptychography at high energy and with many parallel beams must be overcome to extract the full advantages for ...
  59. [59]
    Strain Imaging of Nanoscale Semiconductor Heterostructures with X ...
    Apr 23, 2014 · Bragg projection ptychography is an x-ray imaging technique capable of mapping lattice perturbations in single crystal thin films with nanoscale ...
  60. [60]
    Strain in a silicon-on-insulator nanostructure revealed by 3D x-ray ...
    May 18, 2015 · Here, we demonstrate in details how x-ray Bragg ptychography can be used to quantify in 3D a displacement field in a lithographically patterned ...
  61. [61]
    [PDF] X-ray Ptychographic Tomography Reveals Buried 3D Structural ...
    X-ray ptychographic imaging reveals 3D structural defects in perovskites, including void defects, using a 3D rotation scan to reconstruct the structure.
  62. [62]
    Optics for broadband x-ray ptychography - AIP Publishing
    Aug 22, 2025 · The typical x-ray ptychography setup employs a Fresnel zone plate (FZP) for this, but FZPs are diffractive optics, and the chromaticity ...
  63. [63]
    Affordable high-resolution imaging with electron ptychography - 2024
    Feb 27, 2024 · Using electron ptychography, scientists have achieved record-breaking microscopic resolution on conventional transmission electron microscopes.
  64. [64]
    Direct observation of single-atom defects in monolayer two ... - Nature
    Jan 2, 2024 · This has also posed a challenge for electron ptychography, a subset of 4D STEM techniques, which had been suffering from inadequate hardware ...
  65. [65]
    Atomic-level imaging of beam-sensitive COFs and MOFs by low ...
    Mar 21, 2024 · Recent advances in electron ptychography have enabled the spatial resolution of TEM characterization to reach 23 picometers, allowing for ...
  66. [66]
    [PDF] Three-dimensional structure of buried heterointerfaces revealed by ...
    Jul 8, 2024 · Multislice ptychography (MSP) determines 3D structure of buried heterointerfaces, achieving 0.57 Å lateral and 2.5 nm depth resolution.
  67. [67]
    Electron Ptychography and Aberration-Corrected 4D-STEM for ...
    Jul 24, 2024 · As a demonstration, we imaged magnetic domain structures in embedded cobalt films where the magnetic layer was only ∼4 atoms thick (Figure 1B).
  68. [68]
    Ptychography – a label free, high-contrast imaging technique for live ...
    Aug 6, 2013 · The use of ptychography can significantly improve how we visualise, analyse and study cells. Unstained cells absorb very little light and are ...
  69. [69]
    Live Cell Imaging - Automated Cell Tracking and Analysis with ...
    Livecyte's patented technology uses a quantitative phase imaging (QPI) technique known as Ptychography. Phasefocus was awarded a Microscopy Today Innovation ...
  70. [70]
    Terahertz ptychography with efficient FOV for breast cancer tissue ...
    May 20, 2022 · We present two optimization strategies that allow THz ptychography to achieve efficiently a large field of view (FOV) with high resolution.
  71. [71]
    Towards generalizable deep ptychography neural networks - arXiv
    Sep 29, 2025 · The proposed approach enables training of experiment-steering models that provide real-time feedback under dynamic experimental conditions.
  72. [72]
  73. [73]
  74. [74]
  75. [75]
  76. [76]
    Wide-field, high-resolution Fourier ptychographic microscopy - Nature
    Jul 28, 2013 · We report an imaging method, termed Fourier ptychographic microscopy (FPM), which iteratively stitches together a number of variably illuminated, low- ...
  77. [77]
    Subsampled STEM-ptychography | Applied Physics Letters
    Jul 18, 2018 · Ptychography has been shown to be an efficient phase contrast imaging technique for scanning transmission electron microscopes (STEM).
  78. [78]
    Deep learning at the edge enables real-time streaming ... - Nature
    Nov 3, 2023 · We demonstrate a workflow that leverages artificial intelligence at the edge and high-performance computing to enable real-time inversion on X-ray ptychography ...
  79. [79]
    Low-dose cryo-electron ptychography of proteins at sub-nanometer ...
    Sep 14, 2024 · Here we apply 4D-STEM and ptychographic data analysis to frozen hydrated proteins, reaching sub-nanometer resolution 3D reconstructions.