
Distance


Distance is a scalar quantity representing the length of the shortest path between two points in a space, fundamental to geometry, physics, and measurement. In Euclidean geometry, the distance between two points (x_1, y_1) and (x_2, y_2) in the plane is calculated as d = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}, derived from the Pythagorean theorem. This extends to higher dimensions and forms the basis for metric spaces, where a distance function must satisfy non-negativity, symmetry, the identity of indiscernibles, and the triangle inequality: d(x,z) \leq d(x,y) + d(y,z). In physics, distance quantifies the total path length traversed by an object, distinguishing it from displacement, which accounts for direction as a vector. Applications span navigation, such as great-circle distances on spheres for aviation routes, to abstract spaces in data analysis and computer science. Variations like Manhattan distance, which sums absolute differences along axes, arise in contexts prioritizing grid-like paths over straight lines.

Definition and Historical Context

Core Definition and Intuition

Distance, in its most fundamental sense, quantifies the extent of spatial separation between two points or objects, representing the length of the path connecting them. In physics, it is defined as a scalar measuring the total ground covered by an object during motion, independent of direction. This contrasts with displacement, which accounts for the straight-line change from initial to final position, highlighting distance's path-dependent nature. For instance, an object traveling 5 kilometers eastward and then 5 kilometers westward covers a distance of 10 kilometers, though its displacement is zero. The intuitive core of distance arises from everyday experience: it gauges "how far" entities are apart, enabling navigation, estimation of travel time, and comprehension of scale in the physical world. In Euclidean geometry, this intuition formalizes as the straight-line length between points, derived from the Pythagorean theorem. For two points (x_1, y_1) and (x_2, y_2) in a plane, the distance d is d = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}, embodying the shortest path in flat space. This measure underpins calculations in surveying, engineering, and basic kinematics, where empirical verification through tools like rulers or odometers confirms its accuracy. Mathematically, distance extends beyond physical paths to abstract spaces via metric functions, which assign non-negative values to pairs of elements while satisfying properties like symmetry and the triangle inequality. The Euclidean metric serves as the prototypical example, capturing direct separation in observable reality, while deviations in non-Euclidean contexts reveal how geometry influences measured distances. This foundational concept drives the empirical sciences by linking observable separations to quantifiable models, ensuring predictions align with measured outcomes.
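The distinction between path-dependent distance and net displacement described above can be illustrated with a short, self-contained sketch (an editorial illustration, not part of the original article; the function names are arbitrary):

```python
import math

def path_distance(legs):
    """Total ground covered: the sum of the lengths of each leg (a scalar)."""
    return sum(math.hypot(dx, dy) for dx, dy in legs)

def displacement(legs):
    """Magnitude of the net straight-line separation from start to end."""
    x = sum(dx for dx, dy in legs)
    y = sum(dy for dx, dy in legs)
    return math.hypot(x, y)

# The article's example: 5 km east, then 5 km west.
legs = [(5.0, 0.0), (-5.0, 0.0)]
print(path_distance(legs))  # 10.0 km covered in total
print(displacement(legs))   # 0.0 km net displacement
```

The scalar path distance accumulates leg by leg, while the displacement depends only on the endpoints.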

Historical Evolution of the Concept

The concept of distance originated in the practical necessities of ancient civilizations, where it was quantified using anthropometric units derived from human anatomy to facilitate trade, construction, and navigation. In Sumer and Egypt around 3000–2000 BCE, early systems employed measures such as the cubit—defined as the length from elbow to fingertip, approximately 45–52 cm depending on regional variations—and smaller subdivisions like the palm or digit. These units enabled precise layout for monumental architecture, as evidenced by cubit rods inscribed with markings found in Egyptian tombs, reflecting an empirical approach to linear separation without abstract formalization. By the classical Greek period, around 300 BCE, Euclid's Elements elevated distance from mere measurement to a geometric abstraction, implicit in the postulate that a straight line can be drawn between any two points, with lengths determined via constructive proofs and the Pythagorean theorem for right triangles. This framework treated distance as the invariant length of the shortest path in flat space, calculable as \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2} in coordinate terms formalized much later, though Euclid himself avoided coordinates. Roman engineers extended the practical application, using the passus (double pace, about 1.48 m) and mille passus (thousand paces, precursor to the mile) for road-building, achieving accuracies within 5% over long distances via chained surveying. Medieval and Renaissance efforts focused on standardization amid inconsistent local units, culminating in the 1791 French Academy proposal grounding distance in a natural invariant: the meter, defined as 1/10,000,000 of the quadrant of Earth's meridian. In 19th-century mathematics, distance gained rigor through the theory of limits, as in Cauchy's 1821 work on convergence implying bounded separations, paving the way for abstract metric structures.
The modern formalization emerged in the early 20th century, with Maurice Fréchet's 1906 introduction of the écart (a semi-metric satisfying non-negativity and symmetry) and Felix Hausdorff's 1914 definition of metric spaces, axiomatizing distance d(x,y) via positivity, symmetry, and the triangle inequality d(x,z) \leq d(x,y) + d(y,z), decoupling it from any embedding in physical space. This evolution shifted distance from empirical artifact to a foundational structure in topology and analysis, enabling non-intuitive metrics such as those on sequence or function spaces.

Geometric and Physical Distances

Euclidean Distance and Measurement

The Euclidean distance between two points in a plane is the length of the straight-line segment connecting them, computed via the square root of the sum of the squared differences in their Cartesian coordinates. This measure originates from the Pythagorean theorem, which states that in a right triangle, the square of the hypotenuse equals the sum of the squares of the other two sides, directly yielding the distance formula for points (x_1, y_1) and (x_2, y_2) as d = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}. In three dimensions, the formula extends to include the z-coordinate: d = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2 + (z_2 - z_1)^2} for points (x_1, y_1, z_1) and (x_2, y_2, z_2). This generalization preserves the geometric intuition of the shortest path in flat space. For arbitrary points in n-dimensional Euclidean space, the distance is d(\mathbf{x}, \mathbf{y}) = \sqrt{\sum_{i=1}^n (x_i - y_i)^2}, forming the basis for the \ell_2-norm in vector spaces. Physical measurement of distances relies on instruments calibrated in standardized units and assumes local flatness, where curvature effects are insignificant. The metre, the SI unit of length, is defined as the distance light travels in vacuum during 1/299792458 of a second, fixing the speed of light at exactly 299792458 m/s. For short ranges, rigid rods or tape measures enforce this metric through material stiffness, approximating straight-line paths. Precision techniques, such as laser interferometry, determine distances by counting interference fringes from coherent light, with each fringe corresponding to half a wavelength—about 266 nm for a 532 nm green laser—enabling sub-micrometre accuracy under controlled atmospheric conditions.
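The n-dimensional formula above translates directly into code; the following is a minimal illustrative sketch (names and test points are editorial, not from the source):

```python
import math

def euclidean(x, y):
    """Straight-line (l2) distance between two points of equal dimension."""
    return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))

print(euclidean((0, 0), (3, 4)))        # 5.0, the classic 3-4-5 right triangle
print(euclidean((1, 2, 3), (4, 6, 3)))  # 5.0, the same formula in three dimensions
```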

Non-Euclidean and Curved-Space Distances

Non-Euclidean geometries abandon the Euclidean parallel postulate and incorporate constant non-zero curvature, so distances are measured along geodesics rather than straight lines. Elliptic geometry, realized as spherical geometry on a unit sphere, defines distance as the great-circle arc length between points, given by d = \arccos(\mathbf{p} \cdot \mathbf{q}) for unit position vectors \mathbf{p} and \mathbf{q} on the sphere. These arcs exceed the lengths of the straight-line chords through the sphere's interior, with great circles of circumference 2\pi and triangle angles summing to more than \pi. Hyperbolic geometry, featuring constant negative curvature (often normalized to -1), employs models like the Poincaré disk or upper half-plane for distance computation. In the upper half-plane model, the hyperbolic distance between points z and w satisfies \cosh(d(z, w)) - 1 = \frac{|z - w|^2}{2 \operatorname{Im} z \operatorname{Im} w}. Geodesics appear as circular arcs orthogonal to the boundary, and distances grow exponentially toward it, leading to triangle angle sums below \pi. In broader curved spaces modeled by Riemannian manifolds, distances arise from a metric tensor g assigning inner products to tangent spaces. The length of a curve \gamma: [a,b] \to M is \int_a^b \sqrt{g_{\gamma(t)}(\gamma'(t), \gamma'(t))} \, dt, and the distance between points p and q is the infimum of such lengths over all connecting curves. This generalizes the non-Euclidean cases, where the curvature tensor dictates deviations from flat-space norms, enabling precise measurement on manifolds like surfaces of revolution. Locally the Euclidean approximation holds via the tangent space, but global distances reflect the intrinsic geometry.
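Both closed-form distances above can be evaluated numerically; the sketch below is an editorial illustration of the two formulas (the clamping and the chosen test points are assumptions, not from the source):

```python
import math

def sphere_distance(p, q):
    """Great-circle distance on the unit sphere: arccos of the dot product."""
    dot = sum(a * b for a, b in zip(p, q))
    return math.acos(max(-1.0, min(1.0, dot)))  # clamp against rounding error

def hyperbolic_distance(z, w):
    """Poincare upper half-plane distance, inverted from the cosh formula."""
    return math.acosh(1.0 + abs(z - w) ** 2 / (2.0 * z.imag * w.imag))

# Quarter great circle between the north pole and a point on the equator:
print(sphere_distance((0, 0, 1), (1, 0, 0)))      # pi/2, about 1.5708

# Points stacked vertically at heights 1 and e: hyperbolic distance exactly 1.
print(hyperbolic_distance(1j, math.e * 1j))       # 1.0
```

The second check works because along a vertical geodesic the distance is the logarithm of the ratio of heights, here \log(e/1) = 1.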

Relativistic and Cosmological Distances

In special relativity, the proper length L_0 of an object is defined as the distance between its endpoints measured simultaneously in the object's rest frame, and it is the same for all observers who perform the measurement in that frame. For an observer moving at velocity v parallel to the object's length, the measured length L contracts according to L = L_0 \sqrt{1 - v^2/c^2}, where c is the speed of light, reflecting the relativity of simultaneity and the invariance of the spacetime interval. This contraction applies only to the dimension parallel to the motion, with perpendicular dimensions unaffected, as derived from the Lorentz transformation of coordinates. In general relativity, distances are frame-dependent and path-dependent due to spacetime curvature, with proper distance computed as the length of a spacelike geodesic—the shortest path between two events in the curved geometry defined by the metric tensor g_{\mu\nu}. The proper distance s along such a curve is given by s = \int \sqrt{-ds^2}, where ds^2 = g_{\mu\nu} dx^\mu dx^\nu and the sign convention distinguishes spacelike intervals from timelike ones; this accounts for gravitational effects like those near massive bodies, where light deflection and the Shapiro delay alter measured paths. Free-falling observers follow geodesics, but measured distances incorporate the metric's local variations, as confirmed by experiments such as the 1919 solar-eclipse observation of starlight bending. Cosmological distances in an expanding universe, modeled by the Friedmann–Lemaître–Robertson–Walker (FLRW) metric, distinguish between proper distance (physical separation at a fixed cosmic time) and comoving distance (fixed coordinate separation scaled by the scale factor a(t)). The proper distance D_p to an object at redshift z is D_p = a(t_0) \int_0^z \frac{c \, dz'}{H(z')}, where H(z') is the Hubble parameter at redshift z' and t_0 is the present time; this evolves with time due to cosmic expansion, unlike comoving coordinates, which remain fixed. Luminosity distance D_L, inferred from the dimming of standard candles, satisfies D_L = (1+z) D_M, where D_M is the transverse comoving distance, accounting for redshift effects on both photon energy and arrival rate.
The angular diameter distance D_A = D_M / (1+z) relates observed angular size to physical extent, peaking at intermediate redshifts before declining in standard \Lambda\mathrm{CDM} models due to the interplay of matter density and dark energy. These measures enable consistency checks via the distance duality relation D_L = D_A (1+z)^2, tested against supernova and other cosmological survey data.
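The FLRW integral above can be evaluated numerically for a spatially flat model, where the transverse comoving distance equals the line-of-sight comoving distance. The sketch below is an editorial illustration; the parameter values (H_0 = 70 km/s/Mpc, \Omega_m = 0.3, \Omega_\Lambda = 0.7) are assumed round numbers, not figures from the source:

```python
import math

C_KM_S = 299792.458          # speed of light, km/s
H0 = 70.0                    # assumed Hubble constant, km/s/Mpc
OMEGA_M, OMEGA_L = 0.3, 0.7  # assumed flat LambdaCDM density parameters

def hubble(z):
    """H(z) for a flat LambdaCDM universe, km/s/Mpc."""
    return H0 * math.sqrt(OMEGA_M * (1 + z) ** 3 + OMEGA_L)

def comoving_distance(z, steps=10000):
    """Trapezoidal integration of c dz'/H(z') from 0 to z, in Mpc."""
    h = z / steps
    total = 0.5 * (C_KM_S / hubble(0) + C_KM_S / hubble(z))
    for i in range(1, steps):
        total += C_KM_S / hubble(i * h)
    return total * h

z = 1.0
d_c = comoving_distance(z)   # comoving distance (equals D_M in the flat case)
d_l = (1 + z) * d_c          # luminosity distance
d_a = d_c / (1 + z)          # angular diameter distance
print(round(d_c), round(d_l), round(d_a))  # roughly 3300, 6600, 1650 Mpc
```

The duality relation D_L = D_A (1+z)^2 holds here by construction, since both are derived from the same comoving distance.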

Empirical Measurement Challenges

In terrestrial and geodetic surveying, empirical distance measurements encounter random errors from variations in instrument readings or environmental noise, systematic errors from consistent biases such as instrument misalignment or uncompensated environmental effects, and gross errors from human blunders like misrecording data. Random errors follow probabilistic distributions and can be mitigated through repeated measurements and statistical averaging, while systematic errors require identification and correction via calibration or modeling, as their unaddressed propagation amplifies inaccuracies across networks of interconnected measurements. Techniques like electronic distance measurement (EDM) using infrared or laser pulses suffer from atmospheric refraction, which bends light paths and introduces errors up to several parts per million over kilometre baselines, necessitating real-time corrections based on meteorological data such as temperature, pressure, and humidity. Tape measurements face thermal expansion, sag due to gravity, and tension inconsistencies, with sag errors scaling quadratically with span; for a 100-metre invar tape at standard conditions, an uncompensated temperature shift of 1 °C can yield offsets of about 0.1 mm. Global Navigation Satellite Systems (GNSS) like GPS achieve sub-metre precision but contend with multipath reflections, ionospheric delays (up to 10–20 metres of equivalent path length), and satellite clock drifts, compounded by the need for differential corrections or precise point positioning algorithms. At relativistic speeds, length contraction shortens measured distances for objects moving relative to the observer, with the observed length L given by L = L_0 \sqrt{1 - v^2/c^2}, where L_0 is the proper length, v the relative velocity, and c the speed of light; this effect, negligible below v \approx 0.1c (about 30,000 km/s), necessitates frame-dependent protocols for high-velocity experiments like particle accelerators.
In satellite-based systems such as GPS, general relativistic time dilation (clocks run faster in the weaker gravitational field at orbital altitude) and special relativistic velocity effects combine so that orbital clocks gain about 38 microseconds per day relative to ground clocks, an offset that must be corrected to prevent positional errors accumulating at roughly 10 km per day. Cosmological distance measurements via the cosmic distance ladder accumulate uncertainties across its rungs, from trigonometric parallax (limited to roughly kiloparsec scales, with Gaia mission precision of ~0.02% at 100 pc) to standard candles like Type Ia supernovae, where calibration inconsistencies contribute to the Hubble tension—a 4–6σ discrepancy between locally measured (~73 km/s/Mpc) and CMB-derived (~67 km/s/Mpc) expansion rates, potentially signaling systematic biases in distance assumptions or unmodeled evolution in the indicators. Empirical verification remains challenged by light-travel-time delays, redshift-distortion confounds, and the inability to directly observe intervening media, demanding cross-validation with multiple independent methods, which yield concordant but hierarchically dependent results.
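The quoted ~38 microseconds per day can be recovered from first-order arithmetic. The following sketch is an editorial back-of-the-envelope check under simplifying assumptions (circular orbit, non-rotating Earth, clock rates compared at the surface), so it approximates rather than reproduces the operational GPS correction:

```python
GM = 3.986004e14        # Earth's gravitational parameter, m^3/s^2
C = 299792458.0         # speed of light, m/s
R_EARTH = 6.371e6       # mean Earth radius, m
R_ORBIT = 2.6560e7      # approximate GPS orbital radius, m
DAY = 86400.0           # seconds per day

v = (GM / R_ORBIT) ** 0.5                   # orbital speed, about 3.87 km/s
sr_loss = v ** 2 / (2 * C ** 2) * DAY       # special-relativistic slowing, s/day
gr_gain = GM / C ** 2 * (1 / R_EARTH - 1 / R_ORBIT) * DAY  # gravitational speedup

net = gr_gain - sr_loss                     # net gain of the satellite clock
print(round(sr_loss * 1e6, 1))  # about 7.2 microseconds/day lost to velocity
print(round(gr_gain * 1e6, 1))  # about 45.7 microseconds/day gained to gravity
print(round(net * 1e6, 1))      # about 38.5 microseconds/day net gain
```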

Mathematical Formalization

Metric Spaces and Axioms

A metric space formalizes the notion of distance in an abstract setting, consisting of a nonempty set X equipped with a function d: X \times X \to [0, \infty), called a metric, that satisfies specific axioms. This structure generalizes distance to arbitrary sets, enabling the study of convergence, continuity, and completeness without reference to an embedding in Euclidean space. The concept was introduced by Maurice Fréchet in his 1906 doctoral thesis, where he unified notions from function spaces and point-set topology. The axioms defining a metric d are as follows:
  1. Non-negativity: d(x, y) \geq 0 for all x, y \in X. This ensures distances are never negative, mirroring physical separations.
  2. Identity of indiscernibles: d(x, y) = 0 if and only if x = y. This distinguishes distinct points by positive distance and assigns zero distance from a point to itself.
  3. Symmetry: d(x, y) = d(y, x) for all x, y \in X. This reflects the bidirectional nature of distance in isotropic spaces.
  4. Triangle inequality: d(x, z) \leq d(x, y) + d(y, z) for all x, y, z \in X. This captures the efficiency of direct paths over indirect ones, preventing "shortcuts" that violate intuitive geometry.
These axioms ensure the metric induces a topology on X, where open sets are unions of open balls defined by B_r(x) = \{ y \in X \mid d(x, y) < r \}. Variations exist, such as pseudometrics (which allow d(x, y) = 0 for distinct points) or quasimetrics (which drop symmetry), but the standard metric axioms provide the foundational framework for rigorous analysis. Fréchet's formulation in 1906 laid the groundwork for modern functional analysis by abstracting distance from concrete Euclidean or Hilbert spaces.
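The four axioms lend themselves to a brute-force spot check on a finite sample of points. The sketch below is an editorial illustration (the helper name and test points are assumptions); it cannot prove a function is a metric, only falsify it on the sample:

```python
import itertools
import math

def check_metric_axioms(d, points, tol=1e-12):
    """Spot-check the four metric axioms on a finite sample of points."""
    for x, y in itertools.product(points, repeat=2):
        assert d(x, y) >= -tol                       # non-negativity
        assert abs(d(x, y) - d(y, x)) <= tol         # symmetry
        assert (d(x, y) <= tol) == (x == y)          # identity of indiscernibles
    for x, y, z in itertools.product(points, repeat=3):
        assert d(x, z) <= d(x, y) + d(y, z) + tol    # triangle inequality
    return True

euclid = lambda x, y: math.dist(x, y)
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 3.0)]
print(check_metric_axioms(euclid, pts))  # True: Euclidean distance passes
```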

Specific Distance Functions

The Euclidean distance between two points \mathbf{x} = (x_1, \dots, x_n) and \mathbf{y} = (y_1, \dots, y_n) in \mathbb{R}^n is defined as d(\mathbf{x}, \mathbf{y}) = \sqrt{\sum_{i=1}^n (x_i - y_i)^2}, representing the length of the straight-line path under the Pythagorean theorem generalized to higher dimensions. This metric satisfies the metric axioms, including positivity, symmetry, and the triangle inequality, making it the standard distance in Euclidean spaces for applications in geometry and physics. The Manhattan distance, also called the L_1 norm or taxicab distance, is given by d(\mathbf{x}, \mathbf{y}) = \sum_{i=1}^n |x_i - y_i|, measuring the sum of absolute differences along each coordinate axis. It models paths restricted to axis-aligned movements, as in urban grid navigation, and is less sensitive to outliers than the Euclidean distance due to the absence of squaring. The Chebyshev distance, or L_\infty norm, is d(\mathbf{x}, \mathbf{y}) = \max_{i=1,\dots,n} |x_i - y_i|, capturing the maximum coordinate difference and corresponding to king moves on a chessboard where diagonal steps are allowed. This metric emphasizes the dominant variation across dimensions and is used in scenarios requiring uniform bounding, such as approximation theory. These L_p norms generalize under the Minkowski distance: d_p(\mathbf{x}, \mathbf{y}) = \left( \sum_{i=1}^n |x_i - y_i|^p \right)^{1/p} for 1 \leq p < \infty, where p=1 yields the Manhattan distance, p=2 the Euclidean, and the limit p \to \infty the Chebyshev. The parameter p controls sensitivity to large differences, with higher p approximating the maximum deviation. In discrete spaces, the Hamming distance between two strings or vectors of equal length over a finite alphabet measures the number of positions at which they differ: d(\mathbf{x}, \mathbf{y}) = \sum_{i=1}^n \mathbb{I}(x_i \neq y_i), where \mathbb{I} is the indicator function.
Originally developed for error detection in coding theory, it applies to binary or categorical data, quantifying the minimal substitutions needed for equality. Each function induces a metric topology, but their geometries differ: Euclidean preserves angles, while Manhattan and Chebyshev yield diamond- and square-shaped unit balls, respectively, affecting convergence and optimization in algorithms. Selection depends on the space's structure and application, such as clustering, where Manhattan distance mitigates curse-of-dimensionality effects better in high dimensions.
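The family of distances above can be compared on a single pair of points; the sketch below is an editorial illustration (function names and examples are assumptions, not from the source):

```python
def minkowski(x, y, p):
    """L_p (Minkowski) distance; p=1 is Manhattan, p=2 is Euclidean."""
    return sum(abs(a - b) ** p for a, b in zip(x, y)) ** (1.0 / p)

def chebyshev(x, y):
    """L_inf limit of the Minkowski family: largest coordinate difference."""
    return max(abs(a - b) for a, b in zip(x, y))

def hamming(x, y):
    """Number of positions at which two equal-length sequences differ."""
    return sum(a != b for a, b in zip(x, y))

u, v = (0, 0), (3, 4)
print(minkowski(u, v, 1))             # 7.0  (taxicab path along the grid)
print(minkowski(u, v, 2))             # 5.0  (straight-line Euclidean)
print(chebyshev(u, v))                # 4    (king-move count on a chessboard)
print(hamming("karolin", "kathrin"))  # 3    (three differing characters)
```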

Distances in Graphs and Discrete Structures

In graph theory, the distance d(u, v) between two vertices u and v in a finite graph is defined as the minimum number of edges in any path connecting them, equivalent to the length of a shortest path or geodesic. This definition applies to unweighted graphs, where edge lengths are uniformly 1; in weighted graphs, distances incorporate edge weights as the sum along the shortest path. Graph distances satisfy the metric properties: d(u, u) = 0, d(u, v) > 0 for u \neq v, symmetry d(u, v) = d(v, u) in undirected graphs, and the triangle inequality d(u, w) \leq d(u, v) + d(v, w). If no path exists, the distance is conventionally infinite, and the graph is disconnected. Key parameters derived from graph distances include the eccentricity of a vertex v, defined as the maximum distance from v to any other vertex, \mathrm{ecc}(v) = \max_{w} d(v, w). The radius of the graph is the minimum eccentricity over all vertices, \mathrm{rad}(G) = \min_v \mathrm{ecc}(v), representing the smallest "reach" from a central vertex. The diameter is the maximum eccentricity, \mathrm{diam}(G) = \max_v \mathrm{ecc}(v), quantifying the graph's overall extent; a graph has a finite diameter if and only if it is connected. These measures are invariant under isomorphism and used to classify graphs, such as trees, where the diameter equals either twice the radius or twice the radius minus 1. Computing shortest-path distances is central to graph analysis, with algorithms like Dijkstra's for non-negative weights, running in O((V+E) \log V) time using priority queues on sparse graphs with V vertices and E edges. In directed graphs, distances may lack symmetry, and in acyclic cases, topological ordering enables linear-time computation. Applications span network routing, where distances model latency, to social network analysis for influence propagation. Beyond graphs, distances in discrete structures often adapt shortest-path ideas to combinatorial spaces.
In the hypercube graph Q_n on binary strings of length n, the Hamming distance d_H(x, y) counts differing bits, equaling both the graph distance and the minimum number of bit flips needed to transform x into y. This metric underpins error-correcting codes, where minimum distance determines code resilience; for example, the Hamming code of length 7 has minimum distance 3, correcting 1 error. For sequences of unequal length, the Levenshtein (edit) distance minimizes the insertions, deletions, or substitutions (each of cost 1) needed to align strings, computable via dynamic programming in O(mn) time for lengths m, n. It generalizes the Hamming distance to variable-length discrete data and is applied in spell-checking and in bioinformatics for sequence alignment, though it is computationally intensive for long strings compared to Hamming's O(n) linear scan on fixed lengths. In lattices or posets, distances may use chain lengths or order ideals, preserving discreteness while satisfying the metric axioms where possible.
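For unweighted graphs, the shortest-path distance, eccentricity, radius, and diameter defined above fall out of a single breadth-first search per vertex. The sketch below is an editorial illustration on a small path graph (the adjacency-list representation and vertex labels are assumptions):

```python
from collections import deque

def bfs_distances(adj, source):
    """Unweighted shortest-path distances from source, via breadth-first search."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# Path graph a - b - c - d: eccentricities are 3, 2, 2, 3.
adj = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
ecc = {v: max(bfs_distances(adj, v).values()) for v in adj}
print(min(ecc.values()), max(ecc.values()))  # radius 2, diameter 3
```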

Applied and Abstract Distances

Statistical and Divergence Measures

Statistical distances and divergences quantify dissimilarity between probability distributions, random variables, or samples, serving as tools in statistical inference, hypothesis testing, and machine learning. Unlike geometric distances, many such measures fail to satisfy metric axioms like symmetry or the triangle inequality, leading to a distinction between proper distances (which form metrics) and divergences (which are often asymmetric, though still non-negative). Many of these measures arise from information-theoretic principles, such as relative entropy, and are grounded in expectations of logarithmic ratios of densities. The Kullback–Leibler (KL) divergence, also known as relative entropy, measures the inefficiency of approximating one distribution P by another Q, defined for continuous densities as D_{\text{KL}}(P \parallel Q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx, assuming q(x) > 0 wherever p(x) > 0. It is non-negative by Gibbs' inequality, equals zero if and only if P = Q almost everywhere, but is asymmetric—D_{\text{KL}}(P \parallel Q) \neq D_{\text{KL}}(Q \parallel P)—and unbounded, rendering it unsuitable as a metric. KL divergence originates from information theory, where it quantifies the extra bits needed to code samples from P using Q-based codes, and finds applications in model selection and variational inference. The Hellinger distance provides a symmetric alternative, defined as H(P, Q) = \left( \frac{1}{2} \int (\sqrt{p(x)} - \sqrt{q(x)})^2 \, dx \right)^{1/2}, the \ell_2 norm of the difference of square-root densities scaled by 1/\sqrt{2}. Bounded between 0 and 1, it satisfies the triangle inequality and is zero only when P = Q, making it a true metric on the space of probability measures. The Hellinger distance is useful in goodness-of-fit tests and robust statistics due to its insensitivity to tail behavior compared to KL divergence, and it shares convergence properties with other integral probability metrics. The Bhattacharyya distance assesses overlap between distributions via D_B(P, Q) = -\log \int \sqrt{p(x) q(x)} \, dx = -\log BC(P, Q), where BC is the Bhattacharyya coefficient, an inner product of square-root densities.
It is symmetric and non-negative but does not satisfy the triangle inequality, though it bounds other divergences such as Chernoff information. Applied in classification and signal processing, it ranks variables by separability in feature-selection tasks. The Jensen–Shannon (JS) divergence symmetrizes KL divergence via a mixture: JS(P, Q) = \frac{1}{2} D_{\text{KL}}(P \parallel M) + \frac{1}{2} D_{\text{KL}}(Q \parallel M), with M = \frac{1}{2}(P + Q). Bounded by \log 2 and symmetric, with a square root that satisfies the triangle inequality and forms a metric, JS addresses KL's asymmetry, making it preferable for distribution comparison in scenarios requiring mutual-information-like symmetry, such as generative-modeling evaluations. Other measures include the total variation distance, \delta(P, Q) = \sup_A |P(A) - Q(A)| = \frac{1}{2} \int |p(x) - q(x)| \, dx, a metric bounding the maximum difference in event probabilities and useful in coupling arguments for convergence rates. These tools enable empirical estimation from samples, though estimators like plug-in densities introduce bias, mitigated by techniques such as kernel density smoothing in high-dimensional settings. Selection depends on the required properties: divergences like KL for directed information loss, metrics like Hellinger for probabilistic bounds in testing.
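For discrete distributions, the integrals above reduce to sums, making the measures easy to compare side by side. The sketch below is an editorial illustration (the example distributions are arbitrary assumptions):

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence for discrete distributions (q > 0 where p > 0)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p, q):
    """Jensen-Shannon divergence: symmetrized KL against the mixture M."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def hellinger(p, q):
    """Hellinger distance: l2 norm of square-root densities, scaled by 1/sqrt(2)."""
    return math.sqrt(0.5 * sum((math.sqrt(pi) - math.sqrt(qi)) ** 2
                               for pi, qi in zip(p, q)))

def total_variation(p, q):
    """Half the l1 distance between probability vectors."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

p, q = [0.5, 0.5], [0.9, 0.1]
print(kl(p, q), kl(q, p))        # asymmetric: the two directions differ
print(js(p, q) <= math.log(2))   # True: JS is bounded by log 2
print(hellinger(p, q), total_variation(p, q))
```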

Edit and Sequence Distances

Edit distances quantify the minimum number of operations required to transform one string into another, serving as a measure of sequence similarity in computational linguistics, bioinformatics, and related fields. The Levenshtein distance, a foundational edit distance, counts the insertions, deletions, and substitutions of single characters, each with a unit cost of 1, needed to convert a source string into a target string. This distance satisfies the properties of a metric, including non-negativity, symmetry, and the triangle inequality, making it suitable for applications requiring a notion of "closeness" between discrete sequences. Introduced by Vladimir Levenshtein in 1965 in the context of error-correcting binary codes capable of handling deletions, insertions, and reversals, the distance has since been generalized to arbitrary sequences over finite alphabets. Computationally, it is calculated using dynamic programming: for strings X = x_1 \dots x_m and Y = y_1 \dots y_n, a matrix D is constructed where D[i][j] represents the edit distance between the first i characters of X and the first j of Y, with recurrences D[i][0] = i, D[0][j] = j, and D[i][j] = \min(D[i-1][j] + 1, D[i][j-1] + 1, D[i-1][j-1] + [x_i \neq y_j]), yielding O(mn) time and space complexity. This algorithm enables efficient alignment of sequences differing by errors or mutations. Variants extend the basic model to capture specific transformations. The Damerau–Levenshtein distance incorporates transpositions of adjacent characters as an additional operation, reducing the distance for common typing errors like swaps. Weighted edit distances assign varying costs to operations, such as higher penalties for particular substitutions in bioinformatics to reflect biological substitution matrices like BLOSUM.
In sequence alignment for DNA or proteins, edit-distance principles underpin algorithms like Needleman–Wunsch for global alignment, which minimize a score analogous to edit operations but with gap penalties to model insertions or deletions of variable length. Applications span spell-checking, where dictionaries suggest corrections by ranking words with low edit distance to the input; plagiarism detection, comparing document fragments via edit operations; and bioinformatics, for aligning genomic sequences to identify evolutionary divergences or mutations. In the latter, edit distances facilitate approximate matching in large databases, though alignment-free alternatives like k-mer distances are sometimes preferred for scalability in whole-genome comparisons. These measures prioritize empirical similarity over exact matches, enabling robust handling of noisy or evolving data.
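The dynamic-programming recurrence described above translates directly into a table-filling implementation; the sketch below is an editorial illustration of the standard algorithm:

```python
def levenshtein(x, y):
    """Edit distance with unit-cost insertions, deletions, and substitutions."""
    m, n = len(x), len(y)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i        # delete all of x's first i characters
    for j in range(n + 1):
        d[0][j] = j        # insert all of y's first j characters
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            d[i][j] = min(
                d[i - 1][j] + 1,                          # deletion
                d[i][j - 1] + 1,                          # insertion
                d[i - 1][j - 1] + (x[i - 1] != y[j - 1])  # (mis)match
            )
    return d[m][n]

print(levenshtein("kitten", "sitting"))  # 3: k->s, e->i, insert g
```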

Distances in High Dimensions and Machine Learning

In high-dimensional spaces, common metrics such as the Euclidean distance exhibit the phenomenon of distance concentration, where the relative differences between pairwise distances diminish as dimensionality increases, rendering nearest-neighbor distinctions unreliable. This arises because, in spaces with thousands of dimensions typical of modern datasets (e.g., image features or word embeddings), data points tend to lie near the surface of a hypersphere, with most pairwise distances approaching a similar value proportional to the square root of the dimensionality. Empirical studies confirm that for dimensions exceeding 10–20, the ratio of maximum to minimum distances stabilizes near 1, effectively erasing local structure and amplifying the sparsity of data. This "curse of dimensionality," first formalized by Richard Bellman in 1957 for dynamic programming but extended to metric spaces, fundamentally challenges algorithms reliant on geometric proximity. In machine learning, this manifests acutely in unsupervised tasks like clustering (e.g., k-means) and supervised methods like k-nearest neighbors (k-NN), where distances fail to capture meaningful similarities due to inflated volumes and uniform dispersion. For instance, in k-NN on high-dimensional text or genomic data, the "nearest" neighbors may share little semantic or biological relevance, leading to degraded performance as dimensionality grows beyond sample size—a reversal of the benign behavior seen in low dimensions. Peer-reviewed analyses show that Euclidean-distance-based kernels in support vector machines (SVMs) suffer similar degradation, with error rates rising sharply unless mitigated. Dimensionality-reduction techniques, such as principal component analysis (PCA) or t-distributed stochastic neighbor embedding (t-SNE), address this by projecting data to lower dimensions while preserving local distances, though t-SNE prioritizes perceptual fidelity over global metric preservation.
Alternatives to Euclidean distance, such as cosine similarity—which measures angular separation rather than absolute magnitude—prove more robust in high dimensions, particularly for sparse, normalized vectors in natural language processing (NLP) embeddings such as those from word2vec or transformer models. Cosine distances mitigate concentration by ignoring vector lengths, focusing on directional alignment that correlates better with semantic proximity; experiments on datasets with 300+ dimensions (e.g., TF-IDF representations) demonstrate superior retrieval accuracy over L2 norms. Manhattan (L1) distance occasionally outperforms L2 in sparse settings due to its emphasis on coordinate-wise differences, avoiding the quadratic penalty that exacerbates concentration, though it still requires normalization. Advanced approaches include learned metrics via metric learning (e.g., Mahalanobis distances tuned by a large-margin nearest-neighbor loss) or kernel approximations, which embed high-dimensional data into reproducing kernel Hilbert spaces to restore discriminability without explicit reduction. These methods, validated on benchmarks like MNIST extended to synthetic high dimensions, underscore that the data's intrinsic geometry—rather than raw dimensionality—dictates the effective choice of distance.
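Distance concentration is easy to observe empirically. The sketch below is an editorial illustration under assumed conditions (points drawn uniformly from the unit hypercube, comparing norms from the origin as a proxy for pairwise distances), alongside the cosine alternative discussed above:

```python
import math
import random

random.seed(0)

def norm_ratio(dim, n=200):
    """Max/min ratio of distances from the origin to n random points in [0,1]^dim."""
    pts = [[random.random() for _ in range(dim)] for _ in range(n)]
    dists = [math.sqrt(sum(c * c for c in p)) for p in pts]
    return max(dists) / min(dists)

# The spread of distances collapses as dimensionality grows:
print(round(norm_ratio(2), 2), round(norm_ratio(1000), 2))  # large in 2-D, near 1 in 1000-D

def cosine_distance(x, y):
    """1 - cos(angle): ignores magnitudes, compares direction only."""
    dot = sum(a * b for a, b in zip(x, y))
    nx = math.sqrt(sum(a * a for a in x))
    ny = math.sqrt(sum(b * b for b in y))
    return 1.0 - dot / (nx * ny)

print(cosine_distance([1, 0], [2, 0]))  # 0.0: same direction, different lengths
```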

Distances in Social and Metaphorical Domains

Psychological and Perceptual Distances

Psychological distance refers to the subjective experience of remoteness from the self in dimensions such as time, space, likelihood, and social relations, originating from the self in the present moment. This concept, central to construal level theory (CLT), posits that greater psychological distance prompts abstract, high-level construals focusing on core features and desirability, whereas proximal events elicit concrete, low-level construals emphasizing details and feasibility. Developed by Yaacov Trope and Nira Liberman, CLT integrates these distances bidirectionally: manipulations of distance alter construal levels, and construal manipulations shift perceived distance. The four primary dimensions of psychological distance—temporal (future or past events), spatial (physical separation), social (the perspective of others), and hypothetical (probability of occurrence)—interact to influence judgment and decision-making. For instance, distant future events are construed more abstractly, leading to optimistic bias in planning, as individuals prioritize end states over procedural obstacles. Empirical studies demonstrate that increasing psychological distance reduces the impact of incidental emotions on moral judgments, with distant violations evaluated more leniently than proximal ones. Perceptual distance estimation, by contrast, involves the visual and sensorimotor appraisal of egocentric physical separation, often distorted by physiological and psychological factors beyond optical cues like relative size or texture gradients. Research shows that effort expenditure, such as carrying a heavy load, systematically inflates perceived distances to goals like hills or targets, as measured by blind-walking tasks in which burdened participants undershoot farther targets. Motivational and emotional states further modulate these estimates; for example, approach motivation arising from positive affect shortens perceived interpersonal distances, while anxiety expands them, as evidenced in experiments using verbal scaling and action-based measures.
In virtual environments, perceptual distances are underestimated compared to real-world counterparts, with order of exposure affecting calibration: initial real-world experience yields more accurate virtual estimates than the reverse order. Distance estimates drawn from cognitive maps also follow psychophysical power functions, where estimated distances scale as a power of actual ones (with exponents of roughly 1.2–1.5), mirroring direct perception but with greater compression at larger scales. These distortions highlight that perceptual distance is not a veridical readout but a functional estimate tuned to action costs, integrating multisensory inputs and context.

Social and Cultural Distances

Social distance in sociology refers to the degree of sympathetic understanding or emotional closeness perceived between individuals or groups, often manifesting as reluctance to engage in intimate relations such as marriage or close friendship. This concept, formalized by Emory Bogardus in the 1920s, quantifies prejudice and group acceptance through ordinal scales assessing willingness to admit out-groups into progressively closer social roles, ranging from citizenship to family membership. The Bogardus Social Distance Scale, administered in surveys like the 1926 study of 100 U.S. ethnic groups and replicated nationally in 2005, reveals persistent hierarchies of preference; for instance, that survey found Americans most accepting of other whites (mean score 1.19 on a 1-7 scale, where 1 indicates minimal distance), while the most distanced group scored 4.18. Empirical applications extend beyond ethnicity to measure attitudes toward immigrants, religious minorities, and special-needs populations, with scales correlating social distance to discriminatory behaviors and policy preferences. A 2013 national update using the Bogardus scale documented reduced distances toward Asians and Catholics since the 1920s but increased wariness of Arabs post-9/11, attributing shifts to historical events rather than inherent traits. Recent adaptations, such as Guttman scaling refinements, enhance sensitivity by incorporating dynamic response options, though critiques note the scale's cultural specificity limits cross-national comparability. Studies in peer-reviewed sociology journals affirm its predictive validity for intergroup contact avoidance, grounded in observable relational patterns rather than unverified ideological assumptions.

Cultural distance quantifies disparities in societal norms, values, and practices between groups or nations, influencing interactions in international business and management.
Geert Hofstede's framework, derived from employee surveys across 70+ countries in the 1970s-1980s, operationalizes this via six dimensions—power distance, individualism, masculinity, uncertainty avoidance, long-term orientation, and indulgence—each scored 0-100 based on aggregated responses. Distance between countries is typically calculated as the Euclidean distance across these scores, with higher values indicating greater divergence; for example, the U.S. (individualism score 91) exhibits substantial distance from Guatemala (6). In international business research, meta-analyses of 59 studies (covering 10,428 firm-year observations up to 2018) demonstrate that larger cultural distances correlate with reduced foreign market entry and with entry modes favoring joint ventures over wholly-owned subsidiaries, as firms mitigate coordination costs from value mismatches. A study of global investments found that a one-standard-deviation increase in cultural distance reduces returns by 0.5-1.2 percentage points annually, driven by causal factors like communication barriers and trust deficits rather than mere correlation. While Hofstede's indices face academic scrutiny for aggregation biases—potentially overlooking subnational variations or temporal shifts—validity tests confirm their predictive power over null models in forecasting trade volumes and expatriate adjustment failures. Alternative indices, such as those incorporating linguistic or religious distances, yield convergent findings, underscoring empirical robustness despite institutional biases in scholarship favoring equivalence assumptions.
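The Euclidean calculation over dimension scores can be sketched directly. The score vectors below are illustrative placeholders, not Hofstede's published country indices, and simple Euclidean distance is only one of several operationalizations used in the literature (Kogut-Singh-style indices, for example, normalize by dimension variance).

```python
import math

# Minimal sketch: Euclidean cultural distance over six Hofstede-style
# dimension scores (each roughly 0-100). The two score vectors are
# hypothetical examples, not real country data.

def cultural_distance(scores_a, scores_b):
    """Euclidean distance between two equal-length score vectors."""
    if len(scores_a) != len(scores_b):
        raise ValueError("score vectors must cover the same dimensions")
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(scores_a, scores_b)))

country_x = [40, 91, 62, 46, 26, 68]  # hypothetical dimension scores
country_y = [95, 6, 37, 99, 64, 30]

print(round(cultural_distance(country_x, country_y), 1))
```

Larger values indicate greater divergence across the dimension profile as a whole, even when individual dimensions are close.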

Distance Between Sets

In a metric space (X, d), the distance between two nonempty subsets A, B \subseteq X is defined as d(A, B) = \inf \{ d(x, y) \mid x \in A, \, y \in B \}. This infimum quantifies the minimal possible separation between elements of the sets, serving as the greatest lower bound on pairwise distances. The function is symmetric, d(A, B) = d(B, A), nonnegative, and satisfies d(A, A) = 0 if A is nonempty. However, d(A, B) = 0 does not imply A = B; it holds whenever the closures of A and B intersect.

This set distance fails to satisfy the triangle inequality d(A, B) \leq d(A, C) + d(C, B) for arbitrary nonempty C \subseteq X. For instance, in \mathbb{R} with the absolute-value metric, let A = \{0\}, B = \{10\}, and C = \{0, 10\}. Then d(A, B) = 10, while d(A, C) = 0 and d(C, B) = 0, violating the inequality. The failure arises because the infimum captures direct minimal distances without accounting for intermediate connections within C. When the sets are disjoint and closed, with at least one of them compact, d(A, B) > 0 indicates positive separation.

To address these limitations and induce a genuine metric on the collection of nonempty compact subsets of X, the Hausdorff distance is employed: d_H(A, B) = \max \left( \sup_{x \in A} \inf_{y \in B} d(x, y), \, \sup_{y \in B} \inf_{x \in A} d(y, x) \right). Equivalently, d_H(A, B) is the infimum of radii r \geq 0 such that A is contained in the r-neighborhood of B and B is contained in the r-neighborhood of A. The directed Hausdorff distance from A to B, \tilde{d}_H(A, B) = \sup_{x \in A} \inf_{y \in B} d(x, y), measures the maximum extent to which points in A deviate from B; the full Hausdorff distance takes the maximum over both directions. For compact subsets of a metric space, d_H satisfies the metric axioms: d_H(A, B) = 0 if and only if A = B, symmetry, and the triangle inequality d_H(A, B) \leq d_H(A, C) + d_H(C, B). The Hausdorff distance finds applications in analyzing convergence of set sequences, where d_H(A_n, A) \to 0 means A_n converges to A in the Hausdorff metric, and in fields such as computer vision for shape matching and comparison, where it robustly compares point clouds or boundaries despite noise or partial overlaps.
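For finite point sets the infima and suprema above reduce to `min` and `max`, so both distances can be sketched in a few lines. The function names are illustrative; this is a brute-force demonstration, not an efficient algorithm for large sets.

```python
# Sketch for finite subsets of the real line: the inf-based set
# distance and the Hausdorff distance, using min/max since the
# sets are finite.

def set_distance(A, B):
    """inf-based distance: minimum pairwise separation."""
    return min(abs(x - y) for x in A for y in B)

def hausdorff_distance(A, B):
    """Maximum of the two directed sup-inf distances."""
    d_ab = max(min(abs(x - y) for y in B) for x in A)
    d_ba = max(min(abs(y - x) for x in A) for y in B)
    return max(d_ab, d_ba)

A, B, C = {0}, {10}, {0, 10}
# The inf-based distance violates the triangle inequality through C:
print(set_distance(A, B), set_distance(A, C), set_distance(C, B))  # 10 0 0
# whereas the Hausdorff distance does not:
print(hausdorff_distance(A, C), hausdorff_distance(C, B))  # 10 10
```

Here d(A, B) = 10 exceeds d(A, C) + d(C, B) = 0, while d_H(A, C) + d_H(C, B) = 20 comfortably bounds d_H(A, B) = 10.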

Displacement, Directed Distance, and Signed Distance

In physics, displacement is defined as the change in position of an object, measured as the straight-line path from its initial to final coordinates, incorporating both magnitude and direction. This contrasts with distance, a scalar quantity representing the total length of the path traversed, which ignores direction and can exceed the displacement magnitude whenever the path bends or reverses. For instance, an object moving 3 meters east and then 4 meters north has a displacement magnitude of 5 meters (via the Pythagorean theorem), while the total distance traveled is 7 meters. In one-dimensional motion, displacement adopts a sign relative to a coordinate axis, yielding positive values for motion in the positive direction and negative for the opposite, thus embodying directed distance along a line.

Mathematically, for points a and b on a real line, the directed distance from a to b is b - a, which can be positive, negative, or zero, distinguishing it from the unsigned distance |b - a|. This facilitates analysis in linear spaces, where vectors in higher dimensions generalize the concept, with components reflecting directed changes in each coordinate.

The signed distance extends beyond one dimension into geometric contexts, particularly as the signed distance function (SDF) to a curve or surface in computational geometry and graphics. For a point x and set \Omega with boundary \partial \Omega, the SDF is \phi(x) = \pm \operatorname{dist}(x, \partial \Omega), where \operatorname{dist} is the distance to the nearest boundary point and the sign is positive outside \Omega, negative inside, and zero on \partial \Omega. This function satisfies the eikonal equation |\nabla \phi| = 1 away from medial axes, enabling applications in level-set methods for evolving interfaces and in computer graphics for rendering implicit surfaces. In directed distance formulations for lines or planes, the sign indicates relative orientation, such as which side of the line or plane a point lies on with respect to a chosen normal direction.
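A minimal concrete SDF is that of a disk of radius r centered at the origin, sketched below with the sign convention stated above (positive outside, negative inside). The function name is illustrative.

```python
import math

# Sketch: signed distance function for a disk of radius r centered at
# the origin. Sign convention: positive outside, negative inside,
# zero on the boundary circle.

def sdf_disk(x, y, r=1.0):
    """Signed distance from (x, y) to the circle of radius r."""
    return math.hypot(x, y) - r

print(sdf_disk(2.0, 0.0))  # 1.0  : one unit outside the boundary
print(sdf_disk(0.0, 0.0))  # -1.0 : the center, deepest inside
print(sdf_disk(1.0, 0.0))  # 0.0  : on the boundary
```

The gradient of this function has unit magnitude everywhere except at the origin, the disk's medial axis, consistent with the eikonal property |∇φ| = 1.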

Distance Travelled Versus Net Displacement

Distance traveled refers to the total length of the path an object follows during its motion, regardless of direction, and is a scalar quantity measured in units such as meters. It accumulates all segments of movement, including any deviations or reversals, making it always non-negative. For instance, if an object moves 3 meters forward and then 2 meters backward, the distance traveled is 5 meters. Net displacement, in contrast, is the vector change in position from the initial to the final point, incorporating both magnitude and direction. Its magnitude represents the shortest straight-line separation between start and end points; that magnitude is itself a scalar, directionless in that context. Using the prior example, the net displacement is 1 meter forward, with a magnitude of 1 meter.

Mathematically, for a particle in one dimension, \Delta x = x_f - x_i, while distance traveled is \int |v| \, dt over the interval. The key distinction arises because distance traveled accounts for the entire trajectory, whereas the magnitude of net displacement ignores intermediate paths and only considers endpoints; thus, distance traveled is always greater than or equal to the magnitude of net displacement, with equality holding for direct, unidirectional motion without reversals. In closed paths, such as a lap returning to the origin, net displacement is zero while distance traveled equals the full path length. This difference is fundamental in kinematics, where average speed uses distance traveled over time, but average velocity uses net displacement over time.
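For piecewise one-dimensional motion, the distinction amounts to summing absolute versus signed segment lengths, sketched below. The segment-list representation is an illustrative simplification of the integral formulation above.

```python
# Sketch: distance traveled vs. net displacement for piecewise 1-D
# motion, given as a list of signed segment lengths in meters
# (positive = forward, negative = backward).

def distance_traveled(segments):
    """Total path length: sum of unsigned segment lengths."""
    return sum(abs(s) for s in segments)

def net_displacement(segments):
    """Signed change in position: sum of signed segment lengths."""
    return sum(segments)

segments = [3.0, -2.0]  # 3 m forward, then 2 m backward
print(distance_traveled(segments))  # 5.0
print(net_displacement(segments))   # 1.0
```

Dividing each quantity by elapsed time yields average speed and average velocity respectively, matching the closing point of this section; a closed path such as `[5.0, -5.0]` gives zero net displacement but a 10-meter distance traveled.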

  109. [109]
    [PDF] Distributions and Operators for Physicists - University of Florida
    Distance between sets. A distance between sets A and B is de- fined by d(A, B) = inf x∈A, y∈B. |x − y|, which is the greatest lower bound (infimum) for ...
  110. [110]
    [PDF] The contraction mapping theorem, II - Keith Conrad
    For A, B ∈ H(X), their Hausdorff distance is. dH(A, B) = inf{r ≥ 0 : A ⊂ Er(B) and B ⊂ Er(A)}. Let's check the definition makes sense.
  111. [111]
    Hausdorff Distance Image Comparison - Cornell: Computer Science
    It is well known that the Hausdorff distance is a metric over the set of all closed, bounded sets - it obeys the properties of identity, symmetry and triangle ...
  112. [112]
    [PDF] Comparing images using the Hausdorff distance - People @EECS
    Abstract The Hausdorff distance measures the extent to which each point of a “model” set lies near some point of an “image” set and vice versa. Thus, this ...
  113. [113]
    [PDF] Efficiently Locating Objects Using the Hausdorff Distance
    Abstract. The Hausdorff distance is a measure defined between two point sets, here representing a model and an image. The Hausdorff distance is reliable ...
  114. [114]
    Distance and displacement introduction (video) | Khan Academy
    Mar 31, 2018 · Not exactly. The displacement is the distance from the original point to the end-point. For example, if you were to walk in a rectangular pattern, and stop ...
  115. [115]
    properties/distance - calculator.org
    A directed distance is a distance with a direction, or a sense (vector quantity). Along a straight line, a directed distance is a vector joining any two points ...
  116. [116]
    Signed Distance Functions | SpringerLink
    We define signed distance functions to be positive on the exterior, negative on the interior, and zero on the boundary.
  117. [117]
    2.1 Relative Motion, Distance, and Displacement - Physics | OpenStax
    Mar 26, 2020 · Help students learn the difference between distance and displacement by showing examples of motion. ... Distance: The distance traveled is ...