
Topological entropy

Topological entropy is a nonnegative real number that serves as an invariant measuring the complexity of a topological dynamical system, specifically the exponential growth rate of the number of distinguishable orbits under iteration of a continuous map on a compact topological space. Introduced in 1965 by Roy L. Adler, Alan G. Konheim, and M. H. McAndrew, topological entropy was defined using open covers of the space, where the entropy of a map \phi: X \to X is the supremum over all finite open covers \mathcal{U} of \lim_{n \to \infty} \frac{1}{n} \log N(\mathcal{U}^n), with N(\mathcal{U}^n) denoting the minimal number of sets needed to cover the space under the n-fold refinement of the cover by iterates of \phi. This definition captures the global exponential complexity of the orbit structure without relying on a specific metric or measure. Key properties include invariance under topological conjugacy, meaning h(\psi \circ \phi \circ \psi^{-1}) = h(\phi) for a homeomorphism \psi; the power rule h(\phi^k) = k\, h(\phi) for nonnegative integers k (and h(\phi^k) = |k|\, h(\phi) for all integers k when \phi is a homeomorphism); and additivity for product systems, h(\phi_1 \times \phi_2) = h(\phi_1) + h(\phi_2). In the early 1970s, Rufus Bowen provided an equivalent metric formulation that extended the concept to noncompact spaces and proved more amenable to computation, defining the entropy h_d(T) for a uniformly continuous map T on a metric space (X, d) as \sup_K \lim_{\epsilon \to 0} \limsup_{n \to \infty} \frac{1}{n} \log r_n(\epsilon, K), where r_n(\epsilon, K) is the minimal size of an (n, \epsilon)-spanning set for a compact K \subseteq X. This approach, equivalent to the original on compact metric spaces, facilitated connections to symbolic dynamics, such as subshifts of finite type, where the entropy equals the logarithm of the growth rate of the number of periodic points, or equivalently the logarithm of the spectral radius of the associated transition matrix.
Bowen's work also highlighted applications to group endomorphisms and homogeneous spaces, computing the entropy of affine maps on compact groups as the sum of the logarithms of the eigenvalues greater than 1 in modulus. Topological entropy relates to measure-theoretic entropy through the variational principle, which states that the topological entropy equals the supremum of measure-theoretic entropies over all invariant probability measures, providing a bridge between topological and probabilistic descriptions of dynamics. This principle, established in the early 1970s, underscores the invariant's role in classifying chaotic behavior, such as in Axiom A diffeomorphisms, where entropy equals the growth rate of the number of periodic orbits. Applications extend to flows on manifolds, rigidity theorems in homogeneous dynamics, and even arithmetic problems such as Littlewood's conjecture in Diophantine approximation.

Introduction

Overview and Motivation

Topological entropy is a numerical invariant that quantifies the complexity of orbits in dynamical systems defined by a compact topological space and a continuous self-map. It provides a measure of the inherent chaos or unpredictability in such systems by assessing how rapidly the number of meaningfully distinguishable orbits expands under repeated iteration of the map. Introduced by Adler, Konheim, and McAndrew, this concept assigns a nonnegative real number to the transformation, reflecting the system's dynamical complexity without reliance on specific metrics or measures. The development of topological entropy was motivated by advances in symbolic dynamics, where continuous maps are approximated by symbolic representations: partitions of the space encode orbits as symbolic sequences. This framework highlighted the need for a complexity measure invariant under topological conjugacy, ensuring that topologically equivalent systems (related by homeomorphisms) yield the same value, unlike metric- or measure-dependent alternatives in nonlinear dynamics. By drawing on the growth rates of admissible symbol sequences in subshifts, topological entropy offers a robust, conjugacy-preserving tool to classify and compare the intricacy of orbits across diverse systems. Conceptually, topological entropy parallels thermodynamic entropy by gauging the proliferation of distinguishable "states," or uncertainty about the system's evolution, much as the latter tracks the growth of microscopic disorder consistent with macroscopic constraints. In dynamical contexts, it embodies the minimal information rate required to specify evolving orbits, underscoring shared themes of irreversibility and complexity growth across disciplines.

Historical Development

The concept of topological entropy was first introduced in 1965 by Roy L. Adler, Alan G. Konheim, and Michael H. McAndrew, who defined it as a topological invariant for continuous self-maps on compact spaces, quantifying the growth rate of orbit complexity independent of any metric structure. Their cover-based approach drew inspiration from measure-theoretic entropy but focused on purely topological features, establishing foundational properties such as monotonicity under factor maps. Independently, E. I. Dinaburg developed a formulation in 1970 using the growth of separated sets in compact metric spaces, providing an alternative characterization that emphasized the role of small distances in distinguishing orbits. Rufus Bowen, in 1971, similarly proposed a definition based on spanning and separated sets with respect to a metric, demonstrating its equivalence to the original Adler-Konheim-McAndrew version on compact spaces and extending applicability to uniformly continuous maps. Following these developments, the 1970s saw extensions beyond strictly compact settings; for instance, Bowen in 1973 defined topological entropy for noncompact subsets of compact spaces by adapting separated-set counts to resemble Hausdorff dimension calculations, allowing analysis of local dynamics on invariant sets. Further generalizations to fully noncompact spaces and to continuous flows emerged in subsequent works, addressing limitations in uniformity assumptions. By the early 1980s, Peter Walters' comprehensive treatment in his 1982 monograph unified these strands, elucidating connections to ergodic theory through the supremum of measure-theoretic entropies over invariant measures and extending definitions to broader classes of dynamical systems. Early definitions highlighted gaps, such as inconsistencies between cover-based and metric approaches under non-uniform continuity or on noncompact domains, prompting refinements in the 1980s toward cohesive frameworks that reconciled the variants via equivalence proofs and variational characterizations.

Definitions

Cover-Based Definition

Topological entropy provides a measure of the complexity of a continuous dynamical system on a compact topological space, quantifying the exponential growth rate of orbit distinctions without relying on a measure. The original definition, introduced by Adler, Konheim, and McAndrew, employs open covers to capture this growth in a purely topological manner. For a compact topological space X and a continuous map f: X \to X, the topological entropy h_{\text{top}}(f) is defined as h_{\text{top}}(f) = \sup_{\mathcal{U}} h(f, \mathcal{U}), where the supremum is taken over all finite open covers \mathcal{U} of X, and h(f, \mathcal{U}) = \lim_{n \to \infty} \frac{1}{n} \log N\left( \bigvee_{k=0}^{n-1} f^{-k} \mathcal{U} \right). Here, N(\mathcal{V}) denotes the minimal cardinality of a subcover of an open cover \mathcal{V}, and \bigvee_{k=0}^{n-1} f^{-k} \mathcal{U} is the join of the covers f^{-k} \mathcal{U} for k = 0, \dots, n-1, consisting of all intersections \bigcap_{k=0}^{n-1} U_k with U_k \in f^{-k} \mathcal{U}. This formulation assigns to each cover \mathcal{U} an asymptotic value reflecting the average exponential growth of the minimal subcover size under iterated preimages of f. The quantity h(f, \mathcal{U}) interprets topological entropy as the asymptotic growth rate of the "distinguishing power" of the cover \mathcal{U} under iteration of f: as n increases, the join cover \bigvee_{k=0}^{n-1} f^{-k} \mathcal{U} requires increasingly many sets to cover X, with the exponential base given by e^{h(f, \mathcal{U})}, indicating how finely the cover separates points along orbits. The supremum over all covers ensures the definition captures the maximal such rate across refinements, providing an intrinsic topological invariant. For instance, in systems with chaotic behavior this growth is rapid, reflecting high complexity.
The existence of the limit in h(f, \mathcal{U}) follows from the subadditivity of the sequence H_n = \log N\left( \bigvee_{k=0}^{n-1} f^{-k} \mathcal{U} \right), which satisfies H_{m+n} \leq H_m + H_n for all positive integers m, n, since the join for m+n iterations refines into at most the product of the subcover sizes for m and n iterations. By Fekete's lemma applied to this non-negative subadditive sequence, \lim_{n \to \infty} H_n / n exists and equals \inf_{n \geq 1} H_n / n, which is finite due to the compactness of X. This cover-based approach is particularly advantageous for non-metric spaces, as it relies solely on the topology of X and the continuity of f, without requiring distances or uniform structures, and thus extends naturally to arbitrary compact Hausdorff spaces. In contrast, the equivalent definitions using separated or spanning sets apply primarily to metric spaces.
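The subadditivity argument can be checked numerically. In the sketch below (an added illustration, not part of the original definition), the counts of admissible words in the golden-mean shift stand in for the subcover cardinalities N(\bigvee_k f^{-k}\mathcal{U}): the counts are submultiplicative, so H_n = \log N_n is subadditive and, by Fekete's lemma, H_n/n converges to its infimum, here \log((1+\sqrt{5})/2).

```python
import math

# Admissible words of length n in the golden-mean shift (binary
# sequences with no "11"); their counts N_n are submultiplicative,
# playing the role of the minimal subcover cardinalities.
def count_words(n):
    a, b = 1, 1  # length-1 words ending in 0 and in 1
    for _ in range(n - 1):
        a, b = a + b, a  # "0" may follow anything, "1" only follows "0"
    return a + b

H = [math.log(count_words(n)) for n in range(1, 61)]
rates = [H[n - 1] / n for n in range(1, 61)]

golden = math.log((1 + math.sqrt(5)) / 2)  # the limit, about 0.4812

# Subadditivity H_{m+n} <= H_m + H_n (Fekete's hypothesis)
for m in range(1, 30):
    for n in range(1, 30):
        assert H[m + n - 1] <= H[m - 1] + H[n - 1] + 1e-9

print(rates[0], rates[-1])  # approaches the infimum from above
```

Each H_n/n stays at or above the limiting value, in line with \lim H_n/n = \inf H_n/n.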

Separated Sets Definition

The separated sets definition of topological entropy, introduced independently by Dinaburg and Bowen, provides a metric-based approach suitable for compact metric spaces (X, d) and continuous maps f: X \to X. A set E \subset X is called an (n, \varepsilon)-separated set if for any two distinct points x, y \in E, there exists some 0 \leq k < n such that d(f^k(x), f^k(y)) > \varepsilon. Let S(n, \varepsilon) denote the supremum of the cardinalities of all such (n, \varepsilon)-separated sets in X, which is finite by compactness. The topological entropy is then defined as h_{\text{top}}(f) = \sup_{\varepsilon > 0} \limsup_{n \to \infty} \frac{1}{n} \log S(n, \varepsilon), where the limit superior is used because S(n, \varepsilon) need not be submultiplicative, so the plain limit may fail to exist. This formulation captures the exponential growth rate of the maximal number of orbit segments of length n that can be distinguished up to precision \varepsilon, quantifying the complexity of the dynamics through the proliferation of distinct future behaviors. This definition is equivalent to the original cover-based definition of topological entropy, as established by Bowen by comparing separated-set cardinalities with minimal subcover counts at scale \varepsilon: maximal (n, \varepsilon)-separated sets bound the minimal numbers of sets in covers by dynamical \varepsilon-balls from above and below, ensuring the entropies coincide on compact spaces. An analogous construction uses (n, \varepsilon)-spanning sets, where a set E \subset X spans if for every x \in X there exists y \in E such that \max_{0 \leq k < n} d(f^k(x), f^k(y)) < \varepsilon. Let r(n, \varepsilon) be the infimum of the cardinalities of such spanning sets. Bowen showed that h_{\text{top}}(f) = \sup_{\varepsilon > 0} \limsup_{n \to \infty} \frac{1}{n} \log r(n, \varepsilon), and this coincides with the separated-sets entropy, since a maximal (n, \varepsilon)-separated set is automatically (n, \varepsilon)-spanning and the two quantities satisfy r(n, \varepsilon) \leq S(n, \varepsilon) \leq r(n, \varepsilon/2). This equivalence highlights the robustness of the definition across related combinatorial measures of dynamical separation.
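As a rough numerical illustration (added here, with arbitrary grid and parameter choices), one can lower-bound S(n, \varepsilon) for the doubling map x \mapsto 2x \bmod 1 on the circle by greedily assembling an (n, \varepsilon)-separated set from grid points. For fixed \varepsilon the rate \frac{1}{n}\log|E| tends to h_{\text{top}} = \log 2 \approx 0.693 as n grows, carrying an O(\log(1/\varepsilon)/n) excess at finite n.

```python
import math

def separated_rate(n=8, eps=0.1, grid=1024):
    """Greedy lower bound on the (n, eps)-separated growth rate for
    the doubling map x -> 2x (mod 1) on the circle R/Z."""
    def orbit(x):
        pts = []
        for _ in range(n):
            pts.append(x)
            x = (2.0 * x) % 1.0  # exact for dyadic grid points
        return pts

    def dist(a, b):  # arc distance on the circle
        t = abs(a - b) % 1.0
        return min(t, 1.0 - t)

    orbits = [orbit(i / grid) for i in range(grid)]
    chosen = []  # orbits of the separated points found so far
    for o in orbits:
        # keep o if it is (n, eps)-separated from every chosen orbit
        if all(any(dist(a, b) > eps for a, b in zip(o, c)) for c in chosen):
            chosen.append(o)
    return math.log(len(chosen)) / n

rate = separated_rate()
print(rate)  # near log 2 ~ 0.693, slightly above it at this small n
```

The greedy set is only a lower bound on S(n, \varepsilon), but it already exhibits the exponential growth the definition measures.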

Properties

Invariance and Continuity

Topological entropy is invariant under topological conjugacy: if two continuous maps f: X \to X and g: Y \to Y on compact spaces are conjugate via a homeomorphism h: X \to Y satisfying h \circ f = g \circ h, then h_{\text{top}}(f) = h_{\text{top}}(g). This property follows directly from the definition, since the conjugacy preserves the structure of orbits and the growth rate of distinguished points or covers, leaving the exponential complexity unchanged. Topological entropy also exhibits an upper semicontinuity property with respect to the Hausdorff metric on the space of compact invariant subsets of a fixed compact metric space: if a sequence of compact invariant subsets K_n converges to a compact invariant set K in the Hausdorff metric, then \limsup_{n \to \infty} h_{\text{top}}(f|_{K_n}) \leq h_{\text{top}}(f|_K), where f|_K denotes the restriction of the map f to K. This robustness ensures that small perturbations of the underlying set do not increase the entropy beyond that of the limit system, reflecting the stability of the invariant under structural changes. Monotonicity holds for factor maps: if (Y, g) is a factor of the system (X, f) via a continuous surjective map \pi: X \to Y satisfying \pi \circ f = g \circ \pi, then h_{\text{top}}(f) \geq h_{\text{top}}(g). In this setting the factor system captures a quotient of the dynamics, so its complexity cannot exceed that of the original: separated sets in Y lift to separated sets in X, but not conversely. This underscores topological entropy's role as a measure of dynamical richness that decreases or stays the same under quotient constructions. Regarding behavior under time reversal and composition, topological entropy is preserved under inversion for homeomorphisms: if f is a homeomorphism, then h_{\text{top}}(f) = h_{\text{top}}(f^{-1}). More generally, for integer powers, h_{\text{top}}(f^k) = |k| \cdot h_{\text{top}}(f) for any integer k \neq 0 (with k restricted to positive values when f is not invertible), demonstrating linear scaling with iteration in either direction.
These properties highlight the symmetry and scaling of entropy with respect to the \mathbb{Z}-action generated by the map.
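The power rule can be checked concretely for a subshift of finite type, where h_{\text{top}}(\sigma_A) = \log \rho(A) and the k-th power of the shift has transition matrix A^k. This is an illustrative sketch for one example (the golden-mean shift), not a general proof.

```python
import numpy as np

# For an SFT, h_top(sigma_A) = log rho(A); the k-th power of the
# shift corresponds to the matrix A^k, so the power rule
# h_top(f^k) = k * h_top(f) reads log rho(A^k) = k * log rho(A).
A = np.array([[1, 1],
              [1, 0]])  # golden-mean shift (forbid the word "11")

def rho(M):
    """Spectral radius: largest eigenvalue modulus."""
    return max(abs(np.linalg.eigvals(M)))

h1 = np.log(rho(A))                             # h_top(sigma_A), ~0.4812
h3 = np.log(rho(np.linalg.matrix_power(A, 3)))  # h_top(sigma_A^3)
print(h1, h3)  # h3 equals 3 * h1 up to rounding
```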

Variational Principle

The variational principle establishes a fundamental connection between topological entropy and measure-theoretic entropy for a continuous self-map f: X \to X on a compact metric space X. Specifically, it states that the topological entropy satisfies h_{\text{top}}(f) = \sup_{\mu \in \mathcal{M}_f(X)} h_\mu(f), where \mathcal{M}_f(X) denotes the set of f-invariant probability measures on X, and h_\mu(f) is the Kolmogorov-Sinai entropy of f with respect to \mu. This equality bridges the topological notion, which captures the maximal complexity of orbits without reference to a specific measure, and the measure-theoretic notion, which quantifies average orbit complexity under a given measure. The proof proceeds in two directions. First, for any \mu \in \mathcal{M}_f(X), the measure entropy h_\mu(f) is bounded above by h_{\text{top}}(f); this is obtained by relating the entropy of measurable partitions to the minimal cardinality of subcovers in the topological definition, ensuring that metric entropy cannot exceed the growth rate of distinguishing sets. Second, the reverse inequality h_{\text{top}}(f) \leq \sup_{\mu \in \mathcal{M}_f(X)} h_\mu(f) is shown by constructing a sequence of invariant measures \mu_n supported on maximal (n,\varepsilon)-separated sets, whose entropies approximate h_{\text{top}}(f); by the weak* compactness of \mathcal{M}_f(X), a limit measure \mu exists whose entropy recovers h_{\text{top}}(f) as \varepsilon \to 0. In systems possessing the specification property, such as mixing subshifts of finite type, the proof can leverage shadowing lemmas to refine approximations and ensure the existence of a measure achieving the supremum. A measure \mu \in \mathcal{M}_f(X) achieving h_\mu(f) = h_{\text{top}}(f) is called a measure of maximal entropy. For dynamical systems with the specification property, Bowen proved the existence of at least one such measure by constructing it via periodic approximations, exploiting the property to patch together arbitrary finite orbit segments with controlled error.
Under additional assumptions, such as expansiveness together with specification, this measure is unique. The principle has key implications for computation: topological entropy can often be evaluated by identifying distinguished invariant measures (e.g., equilibrium states, or the Parry measure for an irreducible subshift of finite type) and calculating their entropies, which may be feasible via symbolic representations or transfer operators when direct topological estimates are intractable. Moreover, since topological entropy is preserved under topological conjugacy, the variational principle holds equivalently for conjugate systems.

Examples

Symbolic Dynamics

Symbolic dynamics provides a discrete framework for studying topological entropy through subshifts on sequences over finite alphabets. A subshift is a closed shift-invariant subset of the full shift space, where the full shift consists of all bi-infinite sequences over a finite alphabet of symbols, equipped with the product topology and the shift map σ that moves each sequence one position to the left. Topological entropy measures the exponential growth rate of the complexity of these systems, often computed via the cardinality of (n, ε)-separated sets. The simplest example is the full shift on k symbols, denoted (Σ_k, σ), where Σ_k is the set of all bi-infinite sequences with entries in {1, ..., k}. Here the maximum cardinality of an (n, ε)-separated set is exactly k^n for sufficiently small ε, since one can choose one point from each of the k^n distinct cylinders of length n, and these points are (n, ε)-separated. Thus the topological entropy is h_top(σ) = \lim_{n \to \infty} \frac{1}{n} \log k^n = \log k. This value is the maximal possible entropy for subshifts on k symbols. A key class is the subshifts of finite type (SFTs), defined by forbidding a finite set of words and encoded by a transition matrix A, an irreducible 0-1 matrix indicating the allowed transitions between symbols. The topological entropy of the associated shift σ_A is h_top(σ_A) = \log λ(A), where λ(A) is the Perron-Frobenius eigenvalue (spectral radius) of A, guaranteed by the Perron-Frobenius theorem for irreducible non-negative matrices. This follows because the number of admissible words of length n grows asymptotically like a constant multiple of λ(A)^n, yielding the entropy via separated sets. Topological Markov chains are equivalent to SFTs and can be represented by directed graphs, where vertices correspond to symbols and edges to allowed transitions, with A as the adjacency matrix. The entropy computation mirrors that of SFTs: h_top = \log λ(A), where λ(A) is the largest eigenvalue of the graph's adjacency matrix.
For instance, for a cycle of length m, A is the permutation matrix of the cycle, λ(A) = 1, and h_top = 0, corresponding to a periodic orbit. This graphical view facilitates explicit calculations, since the eigenvalues of small matrices are straightforward to compute. Minimal subshifts of zero entropy illustrate the lower bound. Sturmian subshifts, generated by codings of irrational rotations of the circle, are minimal aperiodic subshifts on two symbols with exactly n + 1 distinct words of length n, by the Morse-Hedlund theorem. Since the topological entropy satisfies h_top = \limsup_{n \to \infty} \frac{1}{n} \log p(n) \leq \limsup_{n \to \infty} \frac{1}{n} \log (n+1) = 0, and entropy is non-negative, h_top = 0 for Sturmian subshifts. These systems achieve the minimal possible complexity for aperiodic dynamics.
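These eigenvalue computations are easy to carry out numerically. The sketch below (an added illustration of the formulas above) evaluates h_top = log λ(A) for the full 3-shift, a 4-cycle, and the golden-mean shift (forbidding the word "11").

```python
import math
import numpy as np

def sft_entropy(A):
    """h_top = log of the spectral radius of the transition matrix."""
    lam = float(max(abs(np.linalg.eigvals(np.asarray(A, dtype=float)))))
    return math.log(lam)

full3 = np.ones((3, 3))                  # full 3-shift: entropy log 3
cycle4 = np.roll(np.eye(4), 1, axis=1)   # 4-cycle permutation: entropy 0
golden = [[1, 1], [1, 0]]                # golden-mean shift

print(sft_entropy(full3))   # log 3, about 1.0986
print(sft_entropy(cycle4))  # 0.0
print(sft_entropy(golden))  # log((1 + sqrt 5)/2), about 0.4812
```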

Geometric Systems

In geometric dynamical systems, topological entropy quantifies the complexity of orbits for continuous maps on smooth manifolds, often computed using geometric structures such as stable and unstable manifolds or symbolic codings derived from them. Unlike symbolic dynamics, which deals with discrete shifts on sequence spaces, geometric examples involve diffeomorphisms or homeomorphisms on compact manifolds, where entropy arises from hyperbolic behavior and the exponential growth of separated sets along unstable directions. These systems provide concrete illustrations of positive entropy in smooth settings, highlighting how local expansion rates determine global complexity. A classic example is the toral automorphism f_A on the 2-torus \mathbb{T}^2 = \mathbb{R}^2 / \mathbb{Z}^2, induced by a hyperbolic integer matrix A \in \mathrm{SL}(2, \mathbb{Z}) with eigenvalues \lambda > 1 and 1/\lambda < 1. The topological entropy is h_{\mathrm{top}}(f_A) = \log \lambda, reflecting the exponential growth of orbits due to the expanding eigenspace. This value equals the logarithm of the spectral radius of A, and it can be derived from the growth of the number of periodic points or via a Markov partition that conjugates the dynamics to a subshift of finite type. For instance, the Arnold cat map with A = \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix} has \lambda = (3 + \sqrt{5})/2 \approx 2.618, yielding h_{\mathrm{top}} \approx 0.962. Anosov diffeomorphisms on compact Riemannian manifolds generalize this hyperbolicity, featuring a continuous invariant splitting of the tangent bundle into stable and unstable subbundles, where the map contracts uniformly along the stable directions and expands uniformly along the unstable ones. Such systems always exhibit positive topological entropy, as the exponential expansion in the unstable directions drives the growth of the number of (n, \epsilon)-separated sets, with h_{\mathrm{top}}(f) > 0 bounded below by the minimal expansion rate in the unstable bundle.
For example, linear Anosov maps on infranilmanifolds inherit their entropy from the eigenvalues of the inducing automorphism, as in the toral case, while nonlinear perturbations preserve positivity thanks to the structural stability of uniform hyperbolicity. The Smale horseshoe on the unit square provides a prototypical geometric construction of positive entropy: the map stretches the square into a long thin strip, folds it into a horseshoe shape, and embeds it back, creating two disjoint bands that are mapped across the original square at the next iterate. This embeds a full shift on two symbols, yielding h_{\mathrm{top}} = \log 2 \approx 0.693 on the invariant set, computed via the symbolic coding of the invariant Cantor set formed by the intersection of all preimages of the bands. The coding captures the doubling of branches at each step, illustrating how a local geometric mechanism generates maximal two-symbol complexity for a piecewise-linear map. In contrast, non-hyperbolic geometric systems such as the circle rotation R_\alpha: S^1 \to S^1, x \mapsto x + \alpha \pmod{1} with \alpha irrational, have zero topological entropy, as the dynamics are minimal and equicontinuous, producing no exponential separation of orbits. Since rotations are isometries, the cardinality of an (n, \epsilon)-separated set stays bounded in n, reflecting the rigid, non-expansive nature of rotations on the circle.
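The cat map value quoted above can be reproduced directly from the eigenvalues of the inducing matrix; this is a small added check, not a new computation method.

```python
import math
import numpy as np

# Arnold cat map on the 2-torus: h_top = log(expanding eigenvalue)
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
lam = float(max(abs(np.linalg.eigvals(A))))  # (3 + sqrt 5)/2, ~2.618
h = math.log(lam)                            # ~0.9624
print(lam, h)
```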

Relations to Other Entropies

Measure-Theoretic Entropy

Measure-theoretic entropy, also known as Kolmogorov-Sinai entropy and denoted h_\mu(f), quantifies the average rate at which information is produced by the iterations of a measure-preserving transformation f on a probability space (X, \mathcal{B}, \mu), where \mu is an f-invariant probability measure. It is computed as the supremum, over all finite measurable partitions \mathcal{A} of X, of the limit of the normalized Shannon entropy of the n-th refinement \bigvee_{i=0}^{n-1} f^{-i} \mathcal{A}, capturing the exponential growth of uncertainty in predicting the system's behavior under the measure \mu. A fundamental relationship between topological and measure-theoretic entropy is the inequality h_{\text{top}}(f) \geq h_\mu(f) for every f-invariant probability measure \mu on the compact space supporting f. This bound reflects the fact that topological entropy measures the maximal possible complexity across all invariant measures, while h_\mu(f) averages complexity with respect to a specific \mu. The inequality was established by Goodwyn for continuous maps on compact metric spaces. The variational principle provides the precise link, stating that h_{\text{top}}(f) = \sup \{ h_\mu(f) \mid \mu \text{ is an } f\text{-invariant probability measure on } X \}. This equality, completing the connection initiated by the upper bound, was proved by Goodman for general continuous maps on compact metric spaces. Invariant measures \mu attaining the supremum are termed equilibrium states (for the zero potential) or measures of maximal entropy, and their existence is guaranteed under mild conditions on f, such as expansiveness. An illustrative example occurs in symbolic dynamics, specifically the full two-sided shift \sigma on \{0,1\}^{\mathbb{Z}}, where h_{\text{top}}(\sigma) = \log 2. For the uniform Bernoulli measure \mu_{1/2,1/2}, h_{\mu_{1/2,1/2}}(\sigma) = \log 2, achieving equality as the measure of maximal entropy.
In contrast, for a non-uniform Bernoulli measure \mu_{p,1-p} with 0 < p < 1/2, the measure-theoretic entropy is h_{\mu_{p,1-p}}(\sigma) = -p \log p - (1-p) \log (1-p) < \log 2, demonstrating that the bound is sharp but not attained by every invariant measure.
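The Bernoulli family makes the variational picture explicit: sweeping p over [0, 1] (an added numerical illustration) shows every h_\mu below the bound \log 2, with the supremum attained exactly at p = 1/2.

```python
import math

def bernoulli_entropy(p):
    """Kolmogorov-Sinai entropy of the (p, 1-p) Bernoulli shift."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

h_top = math.log(2)  # topological entropy of the full 2-shift
ps = [i / 1000 for i in range(1001)]
hs = [bernoulli_entropy(p) for p in ps]

print(max(hs), h_top)           # supremum equals log 2, at p = 1/2
print(bernoulli_entropy(0.1))   # strictly below log 2
```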

Algorithmic Complexity

The topological entropy of a dynamical system (X, f) serves as an upper bound on the asymptotic growth rate of the Kolmogorov complexity of orbit descriptions within the system. For any point x \in X, the limit superior \limsup_{n \to \infty} \frac{1}{n} K(x; n) is at most h_{\text{top}}(f), where K(x; n) denotes the Kolmogorov complexity of a description, at a fixed resolution, of the orbit segment (x, f(x), \dots, f^{n-1}(x)), i.e., the length of the shortest program that outputs that segment. This bound arises because the total number of distinguishable orbit segments of length n grows at most exponentially as e^{n h_{\text{top}}(f)}, allowing any such segment to be described by an index of length approximately n h_{\text{top}}(f) + O(1) bits, plus a fixed prefix for the system. A pivotal result connecting these concepts is Brudno's theorem, which establishes an equality between measure-theoretic entropy and average orbit complexity for typical points. Specifically, for an ergodic invariant probability measure \mu on X, the limit \lim_{n \to \infty} \frac{1}{n} K(x; n) = h_\mu(f) holds for \mu-almost every x \in X. Since h_{\text{top}}(f) = \sup \{ h_\mu(f) : \mu \text{ invariant} \}, this implies that topological entropy upper-bounds the complexity growth for all orbits and is achieved by typical orbits under measures realizing the supremum. The theorem, originally proved for systems on compact metric spaces, highlights how entropy quantifies the incompressibility of orbit data in information-theoretic terms. These relations have profound implications for chaotic dynamical systems, where positive topological entropy h_{\text{top}}(f) > 0 guarantees the existence of orbits with incompressible descriptions, meaning their description length grows linearly at rate h_{\text{top}}(f).
Such orbits cannot be predicted or compressed beyond this rate by any algorithm, reflecting the intrinsic unpredictability and sensitivity to initial conditions that characterize chaos in information-theoretic frameworks. This perspective bridges dynamical systems theory with algorithmic information theory, showing that chaos manifests as algorithmic randomness along certain trajectories. Modern extensions of Brudno's theorem, developed since 2000, have broadened these connections to quantum and generalized dynamical settings. In quantum information theory, a quantum Brudno theorem relates the von Neumann entropy rate of a quantum process to the average quantum Kolmogorov complexity per step for typical states, using notions of quantum program length based on circuit models, preserving the incompressibility insight in the quantum setting. Additionally, generalizations to actions of amenable groups demonstrate that entropy equals the Kolmogorov complexity density almost everywhere with respect to ergodic measures, extending the classical results beyond single-transformation dynamics. These advancements underscore the robustness of entropy-complexity links in emerging areas such as quantum information and multiparameter dynamical systems.
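Kolmogorov complexity itself is uncomputable, but a general-purpose compressor gives a computable upper-bound proxy for it. In the sketch below (an added illustration; the i.i.d. fair bits stand in for a typical itinerary of the full 2-shift, whose entropy is \log 2 nats = 1 bit per symbol), a "typical" symbol stream resists compression at about 1 bit per symbol, while a periodic, zero-entropy orbit compresses to almost nothing.

```python
import random
import zlib

# Compressed size per symbol: a crude, computable proxy for the
# Kolmogorov complexity rate of a symbolic orbit.
def rate_bits_per_symbol(s: bytes) -> float:
    return 8 * len(zlib.compress(s, 9)) / len(s)

n = 100_000
random.seed(0)
# Stand-in for a typical full 2-shift itinerary: i.i.d. fair bits
# encoded as ASCII '0'/'1' (entropy 1 bit per symbol).
typical = bytes(48 + random.getrandbits(1) for _ in range(n))
# A periodic orbit of the shift: zero entropy, highly compressible.
periodic = b"01" * (n // 2)

print(rate_bits_per_symbol(typical))   # close to 1 bit per symbol
print(rate_bits_per_symbol(periodic))  # near zero
```

The compressor can only overestimate complexity, so the typical stream's rate sits slightly above the entropy bound of 1 bit per symbol.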
