
Delaunay refinement

Delaunay refinement is a family of algorithms in mesh generation for producing high-quality unstructured meshes of triangles or tetrahedra by iteratively inserting vertices into a Delaunay triangulation or tetrahedralization to eliminate poorly shaped elements, ensuring boundary conformity and suitability for numerical simulations such as finite element methods. Developed by researchers including L. Paul Chew, Jim Ruppert, and Jonathan Richard Shewchuk starting in the late 1980s, these algorithms address key challenges in mesh generation, including control over element angles, edge lengths, and overall mesh grading, while guaranteeing termination under specified conditions.

In two dimensions, Delaunay refinement typically begins with a constrained Delaunay triangulation of a planar straight line graph (PSLG) representing the input domain's boundary, then refines it by adding vertices at the circumcenters of "skinny" triangles (those with a circumradius-to-shortest-edge ratio exceeding a chosen bound) or at midpoints of encroached segments to prevent short edges. Prominent variants include Ruppert's algorithm, which achieves a minimum angle bound of approximately 20.7° and ensures the mesh is nicely graded and size-optimal for input angles of at least 30°, and Chew's second algorithm, which improves the bound to about 30° with off-center point placement for better grading even when small input angles are present (down to 26.57°). Both approaches leverage the empty circumcircle property of Delaunay triangulations to maintain quality, with theorems proving that no edge is shorter than a constant fraction of the local feature size (the radius of the smallest disk centered at a point that touches two disjoint input features) and that refinement terminates after finitely many insertions, producing meshes with vertex counts proportional to the input complexity.

Extending to three dimensions, the technique refines tetrahedral meshes by inserting vertices at the circumcenters of poorly shaped tetrahedra or at midpoints of encroached subsegments and subfacets, using concepts like diametral lenses for subsegments and equatorial lenses or spheres for subfacets to control vertex placement and avoid excessive mesh density.
Shewchuk's 3D algorithm bounds the circumradius-to-shortest-edge ratio at 1.24 for well-graded meshes or up to 2 using equatorial spheres, avoiding sliver tetrahedra in many cases while acknowledging slivers as a persistent challenge without guaranteed elimination. Termination is proven for ratio bounds greater than 1.22 (with lenses) or 2 (with spheres), assuming input dihedral angles of at least 30° between facets, and the resulting meshes are applied in simulations of complex geometries, such as earthquake modeling, where high aspect ratios can lead to numerical instability. Practical implementations, such as Shewchuk's Triangle software for 2D and extensions like TetGen for 3D, incorporate robust geometric predicates using adaptive exact arithmetic to handle floating-point precision issues, enabling reliable computation even for non-convex or multiply-connected domains. While effective for domains without extremely small input angles, the algorithms may require heuristics like "The Quitter" for cases with acute angles below 90°, trading some quality guarantees for termination. Overall, Delaunay refinement has become a cornerstone of quality mesh generation, influencing subsequent work in adaptive and parallel meshing techniques.

Background

Delaunay Triangulation Basics

A Delaunay triangulation of a finite set of points in the plane is a triangulation in which the circumcircle of every triangle contains no other points from the set in its interior. This defining characteristic is known as the empty circumcircle property. The property ensures that the Delaunay triangulation maximizes the minimum angle over all possible triangulations of the point set. The concept originates from the work of Russian mathematician Boris Delaunay, who introduced it in 1934 in the context of empty spheres in higher dimensions, with the planar case following as a dual to Voronoi diagrams.

Common methods for constructing a Delaunay triangulation include incremental insertion, edge flipping, and divide-and-conquer. In incremental insertion, an initial triangulation (often a large enclosing triangle) is built, and points are added sequentially: the containing triangle is located, split by connecting the new point to its vertices, and then adjacent edges are tested and flipped if they violate the empty circumcircle property. Edge flipping starts from any valid triangulation and iteratively replaces diagonals in quadrilaterals to enforce the property until no violations remain. Divide-and-conquer algorithms recursively triangulate subsets of points separated by a line, then merge the results by adding crossing edges that satisfy the Delaunay criterion.

Steiner points are auxiliary vertices added to the original point set during mesh generation to improve properties like element shape or size distribution; the augmented set is then triangulated to yield a Delaunay structure that preserves the empty circumcircle property. A basic outline for incremental construction from a point set P, assuming a quad-edge data structure for efficient updates, follows the approach of Guibas and Stolfi:
Initialize T as a large triangle enclosing all points in P
For each point p in P (in random order):
    Locate the triangle Δ in T containing p (e.g., via walking from a known triangle)
    If p lies on an edge of Δ, split the edge and the two adjacent triangles accordingly
    Split Δ into three triangles by adding edges from p to Δ's vertices
    Mark the three original edges of Δ as suspect
    While there are suspect edges e:
        Let e separate quadrilateral Q into triangles Δ1 and Δ2
        If the vertex of Δ2 opposite e lies inside the circumcircle of Δ1:
            Flip e to the opposite diagonal e' of Q and mark the four boundary edges of Q as suspect
        Else:
            Remove e from the suspect set
Return T
This process runs in expected O(n \log n) time with randomized insertion order.
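The suspect-edge check in the loop above is the incircle predicate. Below is a minimal floating-point sketch in Python (the function name and use of NumPy are illustrative); robust implementations such as Shewchuk's predicates switch to adaptive exact arithmetic, because this determinant is extremely sensitive to roundoff for nearly cocircular points.

```python
import numpy as np

def in_circumcircle(a, b, c, d):
    """True if point d lies strictly inside the circumcircle of triangle
    (a, b, c), with a, b, c given in counterclockwise order.  This is the
    standard incircle determinant, reduced to 3x3 by translating d to the
    origin."""
    rows = []
    for p in (a, b, c):
        dx, dy = p[0] - d[0], p[1] - d[1]
        rows.append([dx, dy, dx * dx + dy * dy])
    return np.linalg.det(np.array(rows)) > 0.0
```

For the right triangle (0,0), (1,0), (0,1), the point (0.9, 0.9) lies inside the circumcircle while (2, 2) does not, which is exactly the condition that triggers an edge flip.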

Mesh Quality in Numerical Simulation

Initial triangulations of polygonal domains frequently yield skinny triangles with small angles, which introduce numerical instability in simulations by ill-conditioning stiffness matrices and impeding solver convergence. These poor elements also amplify errors in computed solutions, necessitating quality improvements for reliable computational results.

High-quality meshes are evaluated using key metrics such as the minimum angle bound (targeting angles no smaller than 30° to promote robustness), the aspect ratio (longest edge divided by shortest altitude, ideally bounded above by 6), and the radius-edge ratio (circumradius to shortest edge, kept under 2 to avoid excessive elongation). These measures ensure element shapes support accurate interpolation and gradient estimation in applications such as finite element analysis and computational fluid dynamics.

In three dimensions, meshing challenges intensify with the formation of sliver tetrahedra: flat elements exhibiting near-zero volume and dihedral angles approaching 0° or 180°, despite favorable radius-edge ratios around 1.15, which similarly undermine numerical accuracy. Input polygons featuring small angles or intricate features further complicate this, as they can trigger cascading poor elements in initial meshes without intervention.

Delaunay refinement mitigates these deficiencies through iterative insertion of Steiner points, systematically removing substandard triangles or tetrahedra while preserving conformity to the domain boundaries and size constraints. This process leverages the Delaunay property's tendency to maximize minimum angles as a starting point for quality enhancement. The push for such refinements traces to early efforts in the 1980s, when researchers grappled with uncontrolled element shapes in polygonal domains, culminating in the first theoretical guarantees for non-obtuse triangulations (angles ≥ 13°) by Baker, Grosse, and Rafferty in 1988.
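For concreteness, the three metrics can be computed directly from vertex coordinates. The sketch below is a plain-Python illustration (function name assumed), using the law of cosines for the angles, the altitude onto the longest edge as the shortest altitude, and R = abc / (4·area) for the circumradius.

```python
import math

def triangle_quality(a, b, c):
    """Return (min angle in degrees, aspect ratio, radius-edge ratio)
    for the triangle with vertices a, b, c."""
    la, lb, lc = math.dist(b, c), math.dist(a, c), math.dist(a, b)
    # Twice the (unsigned) area, from the cross product.
    area2 = abs((b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]))
    # Interior angles via the law of cosines.
    angles = [
        math.degrees(math.acos((e1 * e1 + e2 * e2 - opp * opp) / (2 * e1 * e2)))
        for opp, e1, e2 in ((la, lb, lc), (lb, la, lc), (lc, la, lb))
    ]
    longest, shortest = max(la, lb, lc), min(la, lb, lc)
    shortest_altitude = area2 / longest        # altitude onto the longest edge
    circumradius = la * lb * lc / (2 * area2)  # R = abc / (4 * area)
    return min(angles), longest / shortest_altitude, circumradius / shortest
```

An equilateral triangle scores 60°, aspect ratio 2/√3 ≈ 1.15, and radius-edge ratio 1/√3 ≈ 0.577, the best possible values; a skinny triangle degrades all three simultaneously.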

Chew's Algorithms

First Algorithm

L. Paul Chew introduced the earliest Delaunay refinement algorithm in his 1989 technical report on generating guaranteed-quality triangular meshes for polygonal domains. The method focuses on improving mesh quality by iteratively eliminating poorly shaped triangles while preserving the Delaunay property and conforming to boundary constraints.

The core mechanism starts with an initial constrained Delaunay triangulation of the input polygon, which incorporates the boundary edges. It then identifies "skinny" triangles (those containing an angle smaller than 30 degrees) and inserts the circumcenter of each such triangle as a new vertex. Following insertion, the mesh is locally retriangulated to restore the Delaunay criterion, ensuring that no triangle's circumcircle contains another vertex in its interior. This process targets obtuse or acute skinny triangles, promoting more equilateral shapes over iterations.

To handle boundaries and ensure mesh conformity, the algorithm checks for encroachments: if a proposed circumcenter lies too close to (or on the wrong side of) a boundary segment, defined as encroaching if the segment intersects the circumcircle or lies within a specified distance, the point is snapped to the midpoint of that segment or the nearest edge point. This snapping maintains the integrity of the input polygon's edges without introducing non-conforming elements.

The refinement loop terminates after a finite number of iterations, as each insertion improves the minimum angle in the mesh and the process cannot continue indefinitely without violating the 30-degree threshold. However, the algorithm provides no strict global bound on the smallest angles produced; in worst-case scenarios, angles as small as about 1° can appear, particularly near sharp input features. A key limitation is the algorithm's inability to bound small angles adjacent to input vertices or edges with tiny preexisting angles, potentially leading to suboptimal quality in those localized regions despite overall improvements.
The iterative refinement can be outlined in pseudocode as follows:
Initialize: Compute constrained Delaunay triangulation T of the input polygon P.

While T contains a skinny triangle (min angle < 30°) or an encroached segment:
    If there is an encroached boundary segment s:
        Compute midpoint m of s (or snap the point to the edge if needed).
        Insert m as a new vertex and retriangulate locally.
    Else, select a skinny triangle Δ:
        Compute circumcenter c of Δ.
        If c encroaches on a boundary segment:
            Snap c to the nearest point on the boundary (e.g., a segment midpoint).
        Insert c (or the snapped point) as a new vertex.
        Retriangulate the cavity of triangles whose circumcircles contain the new vertex, restoring the Delaunay property.

Output: Refined triangulation T with no skinny triangles.
This loop processes elements in a queue to manage efficiency, prioritizing encroached segments to avoid unnecessary interior insertions near boundaries.
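Both branches of the loop need a circumcenter. A minimal closed-form sketch follows (illustrative, not taken from Chew's report); the denominator is four times the signed triangle area, so nearly collinear triangles make the computation ill-conditioned and robust codes guard this step.

```python
def circumcenter(a, b, c):
    """Circumcenter of triangle (a, b, c), the candidate insertion point in
    circumcenter-based refinement.  Derived from the perpendicular-bisector
    equations; d below vanishes for collinear input."""
    ax, ay = a
    bx, by = b
    cx, cy = c
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax * ax + ay * ay) * (by - cy)
          + (bx * bx + by * by) * (cy - ay)
          + (cx * cx + cy * cy) * (ay - by)) / d
    uy = ((ax * ax + ay * ay) * (cx - bx)
          + (bx * bx + by * by) * (ax - cx)
          + (cx * cx + cy * cy) * (bx - ax)) / d
    return (ux, uy)
```

For the right triangle (0,0), (2,0), (0,2) the result is (1, 1), the midpoint of the hypotenuse, as expected for a right triangle.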

Second Algorithm

In 1993, L. Paul Chew introduced a refined algorithm as an advancement over his earlier 1989 method, specifically addressing limitations such as the creation of short edges and small angles near boundaries through improved point placement strategies. This update, analyzed in detail by Shewchuk, focuses on generating high-quality triangular meshes for planar domains while supporting graded sizing functions to allow variable mesh density based on local feature sizes.

The key innovation lies in off-center point placement, eschewing circumcenters in favor of midpoints along short edges or carefully chosen offsets within skinny triangles to prevent the formation of tiny angles and ensure more equitable splitting. The refinement rule targets triangles containing angles smaller than 30°, inserting new vertices to subdivide them into better-proportioned elements; if a proposed circumcenter falls on the opposite side of a protected subsegment, it is rejected, and instead the encroaching subsegment is split at its midpoint. This approach also incorporates a sizing function to control element sizes relative to the local feature size, promoting efficient meshes without excessive refinement in regions of large features.

Boundary handling is enhanced to protect small input angles, typically those less than 60°, by avoiding insertions that would disrupt geometric features; encroaching vertices within the diametral circle of a subsegment are deleted, and splits occur only as needed to maintain the constrained Delaunay property without excessive refinement near boundaries. The algorithm terminates after a finite number of iterations, proven by bounding the minimum distance between new points and ensuring no edges shorter than a prescribed minimum length are created. It guarantees a minimum angle of at least 30° in the output mesh, barring smaller input angles, providing stricter quality bounds than the first algorithm.
For instance, consider a skinny triangle adjacent to a boundary segment with a small input angle; the algorithm identifies the small angle (e.g., <30°), rejects a circumcenter outside the domain, and instead places a new point at the midpoint of the shortest encroaching edge, splitting the triangle into two more equilateral-like elements while preserving the boundary constraint. This insertion refines the local mesh quality without introducing slivers or over-refining nearby features.

Ruppert's Algorithm

Motivation and Objectives

In 1995, Jim Ruppert introduced a Delaunay refinement algorithm aimed at generating high-quality two-dimensional triangular meshes for computational geometry applications, particularly those involving finite element methods, where poor mesh quality can degrade numerical stability and convergence rates. The work, detailed in his paper "A Delaunay Refinement Algorithm for Quality 2-Dimensional Mesh Generation," addresses the challenge of producing meshes that conform to complex input geometries while ensuring triangles have bounded aspect ratios to minimize computational overhead.

Prior refinement techniques, such as L. Paul Chew's 1989 constrained Delaunay algorithm and his 1993 guaranteed-quality refinement algorithm, sought to produce meshes with all angles at least 30° but suffered significant limitations in handling small input angles or sharp features. These approaches could produce arbitrarily skinny triangles near boundaries or narrow regions, and they enforced uniform triangle sizing without sufficient local adaptation, leading to inefficient meshes with excessive vertices in low-resolution areas. Ruppert's algorithm builds on these by introducing variable triangle sizes to overcome such failures, prioritizing robustness over the simplicity of Chew's uniform refinement strategies.

The primary objectives of Ruppert's method are to generate conforming Delaunay triangulations of polygons, including those with holes and interior segments, such that every triangle has a minimum angle of approximately 20.7° and the total number of added Steiner points is within a constant factor of the optimal minimum. It also aims to provide theoretical guarantees, including proof of termination, prevention of newly introduced angles smaller than approximately 20.7°, and effective management of input angles below 30° through encapsulation, such as by "lopping" small corners to avoid propagating poor quality; for input angles ≥ 60°, bounds closer to 30° are achievable.

Algorithm Steps

Ruppert's algorithm takes as input a planar straight-line graph (PSLG), consisting of vertices and non-intersecting segments that define the domain boundaries and any internal features to be preserved in the triangulation. The algorithm initializes by adding a large bounding square around the PSLG to enclose the domain and prevent boundary effects during refinement, then computes an initial constrained Delaunay triangulation of the PSLG's vertices while ensuring all input segments are respected as unions of triangulation edges.

The core refinement operates in an iterative loop that continues until no bad triangles or encroached segments remain. First, the algorithm scans for encroached segments (those where an existing vertex lies inside the segment's diametral circle, the circle with the segment as diameter) and splits each such segment at its midpoint by inserting a new vertex, updating the vertex set, segment list, and triangulation accordingly; this protects input segments by subdividing them as needed to prevent future encroachments. Next, it identifies bad triangles, defined as those with a circumradius-to-shortest-edge ratio exceeding a chosen bound (√2 ≈ 1.41, targeting a minimum angle of about 20.7°). For each such triangle, the circumcenter is computed as a candidate for insertion. However, insertion follows strict rules: the point is not added if it lies too close to a boundary segment (within a small distance ε, often tied to the local feature size, to avoid sliver-like elements near boundaries) or if placement would create tiny angles smaller than a threshold; in these cases, the nearest or encroaching segment is split at an appropriate point instead, such as its midpoint or an offset location chosen to maintain angle bounds. If no violations occur, the circumcenter is inserted, splitting the bad triangle into three new ones and updating the triangulation.
When short segments arise during splitting, the algorithm adds vertices to these subsegments as necessary to further prevent encroachment and ensure the mesh remains well-shaped, prioritizing segment protection over unrestricted point insertion. The iterative process can be expressed in pseudocode as follows:
Algorithm RuppertDelaunayRefine(PSLG, ratio_max ≈ 1.41)
Input: PSLG (vertices V, segments S), maximum r/e ratio_max (for ≈20.7° bound)
Output: Constrained Delaunay triangulation DT with all r/e ≤ ratio_max

// Initialization
Add bounding square B to PSLG
V ← vertices of PSLG ∪ B
S ← segments of PSLG ∪ B
DT ← ConstrainedDelaunayTriangulation(V)  // Ensures S is respected

// Refinement loop
while exists encroached segment in S or bad triangle in DT do
    // Handle encroached segments first (segment protection)
    for each segment s ∈ S where s is encroached do
        p ← midpoint of s
        if p too close to boundary (< ε) then
            // Adjust split point or split nearest segment instead (to avoid tiny angles)
            split nearest boundary segment appropriately
        else
            insert p into V
            update DT with p
            remove s from S
            add two subsegments s1, s2 to S  // Short segment handling

    // Handle bad triangles
    if exists bad triangle t in DT (r/e(t) > ratio_max) then
        p ← circumcenter(t)
        encroached_segs ← segments encroached by p
        if encroached_segs nonempty or dist(p, boundary) < ε or would_create_tiny_angles(p) then
            for each si in encroached_segs or nearest boundary:
                split si (as above)
        else
            insert p into V
            update DT with p  // Splits t into three triangles
end while

return DT
During candidate point insertion, the algorithm enforces a radius-edge ratio bound to guarantee angle quality: for any triangle with circumradius r and shortest edge e, the smallest angle \theta_{\min} satisfies \frac{r}{e} = \frac{1}{2 \sin \theta_{\min}}, so a ratio bound of 1 would correspond to \theta_{\min} = 30^\circ, while the overall guarantee uses a bound of \sqrt{2} \approx 1.41 for the 20.7° result.
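Both tests reduce to a few lines of arithmetic. As a hedged plain-Python sketch (function names assumed): a vertex p encroaches segment ab exactly when p lies inside the diametral circle, i.e. the angle subtended at p is obtuse, and a radius-edge ratio bound ρ converts to an angle bound via θ = arcsin(1/(2ρ)).

```python
import math

def encroaches(p, a, b):
    """True if vertex p lies strictly inside the diametral circle of
    segment ab (the angle a-p-b is obtuse, so the dot product is negative)."""
    return (a[0] - p[0]) * (b[0] - p[0]) + (a[1] - p[1]) * (b[1] - p[1]) < 0

def min_angle_from_ratio(rho):
    """Minimum-angle bound implied by a radius-edge ratio bound rho,
    from r/e = 1 / (2 sin(theta_min))."""
    return math.degrees(math.asin(1.0 / (2.0 * rho)))
```

Here `min_angle_from_ratio(1.0)` gives 30° and `min_angle_from_ratio(2 ** 0.5)` gives about 20.7°, matching the bounds discussed above.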

Quality Guarantees

Ruppert's Delaunay refinement algorithm is guaranteed to terminate for a planar straight-line graph (PSLG) input, as each Steiner point insertion either improves skinny triangles or splits encroached subsegments, with the total number of insertions bounded by O(n), where n is the input size (number of edges). The proof leverages the local feature size (lfs) function, which measures the distance to the nearest non-local feature of the PSLG, ensuring that added vertices are sufficiently separated (at least lfs/2 apart) and that refinement cannot continue indefinitely without violating geometric constraints. Specifically, no more than O(n) Steiner points are added in the worst case.

The algorithm provides a strict angle bound, ensuring that all output triangles have a minimum angle of at least approximately 20.7°, except for those immediately adjacent to protected input angles smaller than 30°, where the bound degrades gracefully to depend on the input angle α (e.g., no new angle smaller than arctan(α / (2 − cos α))). This quality arises from the circumradius-to-shortest-edge ratio test combined with the encroachment rules, while preserving the Delaunay property and conformity to the PSLG. Non-degeneracy is maintained, producing no zero-area triangles even for inputs with arbitrarily small angles (down to 0°), achieved through local refinement that splits skinny regions near input features without introducing flat elements. The output mesh complexity remains linear in n, with the number of triangles within a constant factor of the optimal for the given angle bound. However, these guarantees apply only in two dimensions; the algorithm struggles with very skinny input features, where termination may require adjustments like off-center placements to avoid cycles of refinement.

Shewchuk's Extensions

Triangle Implementation

Jonathan Shewchuk developed the Triangle software in 1996 while at the School of Computer Science, Carnegie Mellon University, as a practical implementation of Ruppert's Delaunay refinement algorithm for generating high-quality two-dimensional triangular meshes, incorporating fixes to enhance numerical robustness. The software applies Ruppert's core refinement rules to produce meshes with well-shaped triangles suitable for finite element methods, while addressing potential issues in floating-point arithmetic that could lead to inconsistencies in Delaunay properties.

Triangle accepts planar straight line graphs (PSLGs) to define mesh boundaries, including the ability to handle holes and assign regional attributes to enforce varying mesh densities or constraints within the domain. Users can specify options such as a minimum angle bound (up to approximately 33.8 degrees) to control slenderness and a maximum triangle area to limit element size, ensuring the resulting mesh meets quality criteria for applications like finite element analysis. These features enable the generation of constrained Delaunay triangulations that conform to input boundaries without introducing skinny triangles.

A key improvement in Triangle is the use of robust geometric predicates for orientation and incircle tests, which employ adaptive exact arithmetic to avoid errors in point-in-circle evaluations critical to Delaunay refinement. This adaptive precision mechanism dynamically increases the arithmetic precision as needed during computations, ensuring exact results even for degenerate cases, and has made the software reliable for large-scale meshing tasks.

As a command-line tool and callable library written in C, Triangle accepts input in .poly files describing vertices, segments, holes, and regional attributes, and outputs meshes in .node files (listing vertex coordinates and attributes) and .ele files (specifying triangle connectivity and markers). It also provides statistics, such as histograms of minimum angles and triangle areas, to verify mesh conformity to user-specified bounds.
For example, a simple .poly file might define a polygonal domain with boundary segments, which Triangle refines into a conforming mesh respecting the -q switch for angle quality. Triangle has had a significant impact, becoming widely adopted in academic research and industrial applications for two-dimensional mesh generation, and it received the 2003 James Hardy Wilkinson Prize for Numerical Software for its contributions to robust geometric computation. The software is freely available for non-commercial use under a permissive license requiring acknowledgment in publications, with its robust predicates released into the public domain to facilitate integration into other tools.
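To make the format concrete, a minimal .poly file for a unit square might look as follows (the file name and comments are illustrative; the layout is vertex header, vertex list, segment header, segment list, then hole count):

```
# square.poly (hypothetical file name)
# 4 vertices, 2 dimensions, 0 attributes, 0 boundary markers
4 2 0 0
1  0.0 0.0
2  1.0 0.0
3  1.0 1.0
4  0.0 1.0
# 4 segments, 0 boundary markers
4 0
1  1 2
2  2 3
3  3 4
4  4 1
# 0 holes
0
```

Invoking `triangle -pq25 square.poly` would then emit square.1.node and square.1.ele describing a quality mesh whose angles are at least 25°.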

3D Delaunay Refinement

Jonathan Richard Shewchuk extended Delaunay refinement algorithms to three dimensions in his 1998 paper, introducing a method for generating quality tetrahedral meshes from polyhedral domains defined as piecewise linear complexes (PLCs). The algorithm adapts the 2D approach by iteratively inserting vertices at the circumcenters of poorly shaped tetrahedra, identified by a circumradius-to-shortest-edge ratio exceeding a user-specified bound (typically 2), while protecting input segments and facets from encroachment to ensure boundary conformity. This process maintains a constrained Delaunay tetrahedralization throughout refinement, handling non-convex polyhedra by first recovering the input boundary through segment and facet insertion.

A primary challenge in 3D meshing is the persistence of sliver tetrahedra: flat elements with small dihedral angles (often near zero) and low volume, despite potentially favorable circumradius-to-shortest-edge ratios (e.g., approximately \sqrt{2}/2 \approx 0.707 for kite slivers). Unlike skinny triangles in 2D, slivers are harder to eliminate because they can occur in configurations that otherwise appear well shaped, and the algorithm addresses them by targeting tetrahedra with poor dihedral angles in addition to the radius-edge metric, though without provable elimination guarantees.

The quality of a tetrahedron \tau is measured by the radius-edge ratio \rho(\tau) = R / l_{\min}, where R is the circumradius and l_{\min} is the shortest edge length; refinement ensures \rho(\tau) \leq 2 for most elements, implying no face angles smaller than about 14.5° and dihedral angles bounded below 165° in practice. To further improve mesh quality, Shewchuk's extensions incorporate post-refinement enhancements such as vertex smoothing via hill-climbing optimization and edge/face swaps, which relocate vertices or locally reconnect elements to minimize maximum angles without violating the Delaunay property.
These variational techniques enhance sliver removal empirically, often achieving dihedral angles between 20.7° and 150°, though no strict theoretical bounds on minimum angles exist due to boundary constraints. Termination is guaranteed if the ratio bound exceeds 2 and input angles are at least 90°, but the algorithm handles general polyhedral inputs robustly by detecting sharp features and preserving necessary poor elements near boundaries. These 3D refinement techniques are implemented in TetGen, a C++ software library and program developed by Hang Si as a 3D counterpart to Shewchuk's Triangle software, supporting features like volume constraints and output in various formats for finite element applications.
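The radius-edge ratio \rho(\tau) translates directly to code. The sketch below (illustrative names, NumPy) finds the circumcenter by solving the linear system that equates squared distances from the center to all four vertices; a regular tetrahedron attains the optimal value \sqrt{6}/4 \approx 0.612, while slivers can score deceptively well, which is why this metric alone cannot rule them out.

```python
import numpy as np

def radius_edge_ratio(p0, p1, p2, p3):
    """Radius-edge ratio rho = R / l_min of the tetrahedron (p0, p1, p2, p3).
    The circumcenter x satisfies |x - p_i|^2 = |x - p_0|^2 for i = 1..3,
    a 3x3 linear system; degenerate (coplanar) input makes it singular."""
    pts = np.array([p0, p1, p2, p3], dtype=float)
    A = 2.0 * (pts[1:] - pts[0])
    b = (pts[1:] ** 2).sum(axis=1) - (pts[0] ** 2).sum()
    center = np.linalg.solve(A, b)
    R = np.linalg.norm(center - pts[0])
    lmin = min(np.linalg.norm(pts[i] - pts[j])
               for i in range(4) for j in range(i + 1, 4))
    return R / lmin
```

For the regular tetrahedron inscribed in the unit cube, with vertices (0,0,0), (1,1,0), (1,0,1), (0,1,1), the function returns \sqrt{6}/4.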

Applications

Finite Element Analysis

Delaunay refinement integrates seamlessly with finite element methods (FEM) by producing high-quality unstructured meshes that ensure well-shaped elements, thereby reducing the condition number of stiffness matrices and enhancing solver accuracy. In FEM simulations, poorly shaped elements with small angles lead to ill-conditioned stiffness matrices, which can amplify numerical errors and slow convergence in iterative solvers; refinement algorithms bound minimum angles (e.g., at least 20.7° in two dimensions), mitigating these issues and improving overall matrix conditioning. This quality control is particularly beneficial for adaptive FEM, where meshes evolve to capture solution features without introducing artifacts from degenerate elements.

Key advantages include avoidance of volumetric locking in nearly incompressible materials and accelerated convergence in adaptive schemes. For incompressible or nearly incompressible material simulations, skinny elements exacerbate locking by over-constraining volumetric changes; Delaunay-refined meshes with bounded aspect ratios and angles help maintain flexibility in the approximation space, reducing artificial stiffness. In adaptive methods, the graded sizing from refinement ensures efficient h-adaptation, focusing computational effort on regions of high gradients while preserving global mesh quality.

A typical workflow begins with generating a refined Delaunay mesh conforming to the domain geometry, followed by applying boundary conditions and assembling the stiffness matrix to solve the governing partial differential equations (PDEs). This process leverages the mesh's conformity to input segments or surfaces, ensuring accurate representation of boundaries before proceeding to FEM discretization and solution. In case studies, Delaunay refinement has been applied to meshing for computational fluid dynamics, such as solving the Navier-Stokes equations around airfoils, where angle-bounded triangles prevent errors from thin boundary layers.
Similarly, in earthquake simulation, it supports modeling of seismic wave propagation in complex basins, avoiding inaccuracies from small angles that distort stress distributions. Recent applications include flood risk assessment for multi-bridge systems and adaptive mesh refinement for electromagnetic simulations. Historically, Delaunay refinement gained adoption in the 1990s within meshing software and custom codes for FEM, following seminal developments like Ruppert's algorithm in 1995. Error estimates in FEM interpolation on these meshes are tied to element angles, with the interpolation error for linear elements bounded by O(h^2 / \sin \theta_{\min}), where h is the element size and \theta_{\min} is the minimum angle; this dependence underscores how refinement's angle guarantees sharpen convergence rates compared to unrefined meshes.
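The 1/\sin \theta_{\min} factor in this bound is easy to tabulate, which makes the value of an angle guarantee concrete; the helper below is purely illustrative.

```python
import math

def interpolation_error_factor(theta_min_deg):
    """The angle-dependent factor 1 / sin(theta_min) in the
    O(h^2 / sin(theta_min)) linear-interpolation error bound;
    it blows up as the minimum angle shrinks."""
    return 1.0 / math.sin(math.radians(theta_min_deg))
```

At the 20.7° bound the factor is about 2.8; at 30° it is exactly 2; at 1° it exceeds 57, illustrating why skinny triangles degrade accuracy.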

Practical Implementations

Delaunay refinement has been integrated into several open-source and commercial software tools for mesh generation in computational simulations. Gmsh, an open-source finite element mesh generator, employs a multi-threaded 3D Delaunay-based refinement algorithm that supports parallel processing for large-scale meshes, enabling efficient generation of high-quality tetrahedral elements. Netgen and CGAL incorporate ideas from Shewchuk's robust Delaunay refinement, such as exact arithmetic predicates to ensure numerical stability during vertex insertion and triangulation maintenance. In commercial environments, Altair HyperMesh utilizes Delaunay triangulation for surface meshing, maximizing minimum angles in triangles to avoid high aspect ratios and slivers, which is particularly useful for automotive and aerospace applications.

Practical deployment faces key challenges, including numerical precision issues arising from floating-point arithmetic, which can lead to failures in satisfying the strict Delaunay criterion and produce incorrect triangulations. Handling large inputs, such as domains requiring millions of elements, exacerbates these problems by increasing computational demands and the risk of accumulated rounding errors during iterative refinement. Optimizations address these issues through parallel refinement techniques, which decouple subproblems to achieve guaranteed quality while maintaining concurrency across processors. Adaptive stopping criteria, often based on error estimators from finite element analysis, allow refinement to halt when mesh quality or size meets application-specific thresholds, reducing unnecessary vertex insertions. Best practices emphasize pre-processing input to mitigate degenerate cases, such as collinear or coplanar points, by applying robust elimination procedures that prepare meshes for reliable simulations.
Post-processing steps, including Laplacian smoothing, further improve element quality by relocating vertices while preserving boundary conformity, especially in parallel settings for large tetrahedral meshes. In terms of computational complexity, Delaunay refinement typically achieves O(n log n) time for constructing constrained triangulations of input size n, with overall cost extending to O(n log n + m), where m is the output size, making it suitable for complex geometries. For instance, it has been applied to mesh intricate engineering structures, generating unstructured grids for aerodynamic simulations with bounded aspect ratios and near-optimal element counts. Recent advances since 2000 include methods that combine Delaunay refinement with octree-based spatial partitioning for faster point location in large domains and advancing-front techniques for boundary-conforming meshing, enhancing efficiency in generating hybrid structured-unstructured grids. These approaches, such as constrained Delaunay with advancing front, improve robustness for complex CAD models while controlling anisotropy.

    Apr 12, 2016 · This approach can be combined with the advancing front method 4-6, the Delaunay insertion method 7-10 or the combination of the two 11-13.