Elementary event

In probability theory, an elementary event, also called an atomic event or sample point, is the fundamental unit of a probability space, defined as a singleton set containing exactly one outcome from the sample space, which represents all possible results of a random experiment. These events serve as the building blocks for constructing more complex events, which are subsets of the sample space comprising multiple elementary outcomes, allowing for the assignment of probabilities to broader scenarios in accordance with Kolmogorov's axioms. In discrete probability models, such as coin flips or dice rolls, each elementary event has the same probability when the outcomes are equally likely, often denoted as \frac{1}{n} where n is the number of sample points. The concept is essential for defining the sigma-algebra of events and ensuring that probabilities are well-defined, non-negative, and sum to 1 over the entire sample space.

Definition and Context

Core Definition

In probability theory, an elementary event is a singleton subset of the sample space \Omega, which represents the universal set of all possible outcomes in a random experiment, and it corresponds to an indivisible outcome that cannot be decomposed into simpler events. Formally, if \Omega is the sample space, then an elementary event is denoted as \{\omega\} for some \omega \in \Omega. These events act as the atoms of the event space, serving as the basic building blocks from which all other events are formed by taking unions of such singletons. The concept of the elementary event gained prominence through Andrey Kolmogorov's axiomatic formulation of probability in 1933, where he explicitly identified these as the fundamental elements e of the set E (the space of elementary events), distinguishing them from composite random events that are subsets of E.
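
As a concrete illustration of this definition, the following minimal Python sketch represents the sample space of a hypothetical six-sided die and its elementary events as singleton sets; the names omega and elementary_events are illustrative choices, not standard library objects.

```python
# A minimal sketch of the definition above: elementary events as singleton
# subsets of the sample space, illustrated with a hypothetical six-sided die.
# The names omega and elementary_events are illustrative, not library objects.

omega = {1, 2, 3, 4, 5, 6}                            # sample space Ω

elementary_events = [frozenset({w}) for w in omega]   # {ω} for each ω in Ω

# Each elementary event contains exactly one outcome and cannot be split
# into smaller non-empty events.
assert all(len(e) == 1 for e in elementary_events)

print(elementary_events)   # [frozenset({1}), frozenset({2}), ..., frozenset({6})]
```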

Relation to Sample Space

In probability theory, the sample space, denoted as \Omega, represents the universal set encompassing all possible outcomes of a random experiment, while elementary events correspond to the individual singleton subsets \{\omega\} for each \omega \in \Omega. These elementary events serve as the atomic units, capturing the most basic, indivisible results of the experiment. The structure of events builds upon these elementary events through the formation of a sigma-algebra \mathcal{F} on \Omega, which includes the empty set, \Omega itself, and is closed under countable unions, intersections, and complements. In finite sample spaces, \mathcal{F} is often the full power set of \Omega, consisting of all possible subsets, each of which is a finite union of elementary events. For infinite sample spaces, particularly continuous ones, the sigma-algebra is typically generated by a basis such as the open intervals (yielding the Borel sigma-algebra), which contains the elementary events as measurable sets, though continuous probability measures assign them probability zero. Events are measurable sets in this sigma-algebra, not necessarily countable unions of singletons. Regardless of the cardinality of \Omega, elementary events retain their indivisible nature, forming the foundational layer from which all composite events are derived. This relational structure underscores the prerequisite role of elementary events in establishing the event algebra of a probability space, enabling the subsequent definition of measurable sets and probability measures.
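
For a finite sample space, the power-set claim can be made explicit in a short Python sketch; the helper power_set and the two-outcome coin-toss space are assumptions introduced for this example only.

```python
# A sketch for a small finite sample space: the event collection is the full
# power set of Ω, and every event is a union of elementary (singleton) events.
# The helper power_set and the coin-toss space are assumptions for the example.
from itertools import combinations

omega = {"H", "T"}                                    # coin-toss sample space

def power_set(s):
    """All subsets of s, as frozensets."""
    items = list(s)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

events = power_set(omega)                             # ∅, {H}, {T}, {H, T}

for event in events:
    singletons = [frozenset({w}) for w in event]
    # frozenset().union() with no arguments is ∅, so the empty event also passes.
    assert event == frozenset().union(*singletons)
```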

Probability Assignment

Probability Measure on Elementary Events

In probability theory, the assignment of probabilities to elementary events forms the foundational layer of a probability space, where an elementary event is a singleton subset \{\omega\} for some outcome \omega in the sample space \Omega. The probability measure P is defined such that it maps each elementary event to a value in [0,1], ensuring that probabilities are non-negative and collectively normalize to unity across the sample space. This measure extends to more complex events through the principle of additivity, establishing a consistent framework for probabilistic reasoning.

The foundation for this stems from Kolmogorov's three axioms, which apply directly to elementary events in discrete settings. Specifically, the first axiom requires that P(\{\omega\}) \geq 0 for every \omega \in \Omega, guaranteeing non-negativity as a core property of valid probabilities. The second enforces normalization by stipulating that the sum of probabilities over all elementary events equals 1, i.e., \sum_{\omega \in \Omega} P(\{\omega\}) = 1, which ensures the total probability is conserved. These properties collectively define a valid probability measure over the elementary events, serving as the building blocks of the probability space.

Furthermore, the third axiom of countable additivity extends the measure to unions of disjoint elementary events. For a countable collection of distinct elementary events \{\omega_i\}, the probability of their union is the sum of their individual probabilities:

P\left( \bigcup_i \{\omega_i\} \right) = \sum_i P(\{\omega_i\})

This additivity principle allows the probability measure on singletons to propagate to arbitrary events within the sigma-algebra generated by the elementary events, maintaining consistency in both finite and countably infinite cases. Non-negativity and normalization remain in force throughout, preventing negative or super-unitary probabilities and upholding the integrity of the measure.
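
The following Python sketch checks these axioms for a discrete probability measure defined on elementary events and computes the probability of a union of disjoint singletons by additivity; the outcome labels and weights are invented for illustration.

```python
# A sketch checking Kolmogorov's axioms for a discrete probability measure
# defined on elementary events; the outcome labels and weights below are
# invented for illustration.

pmf = {"a": 0.2, "b": 0.5, "c": 0.3}                  # P({ω}) for each ω in Ω

# First axiom: P({ω}) ≥ 0.  Second axiom: Σ_ω P({ω}) = 1.
assert all(p >= 0 for p in pmf.values())
assert abs(sum(pmf.values()) - 1.0) < 1e-12

def prob(event):
    """P(E) as the sum over the disjoint elementary events composing E."""
    return sum(pmf[w] for w in event)

print(prob({"a", "c"}))                               # 0.5 by additivity
```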

Uniform vs. Non-Uniform Cases

In the uniform case, each elementary event \omega \in \Omega in a finite sample space \Omega is assigned equal probability, such that P(\{\omega\}) = \frac{1}{|\Omega|}. This assumption holds for scenarios like fair coin tosses, where the sample space \Omega = \{\text{heads}, \text{tails}\} yields P(\{\text{heads}\}) = P(\{\text{tails}\}) = \frac{1}{2}, or standard dice rolls with \Omega = \{1, 2, \dots, 6\} and each face equally likely at \frac{1}{6}. In contrast, the non-uniform case allows probabilities to vary across elementary events, provided they are non-negative and sum to 1 over \Omega. These assignments are modeled using a probability mass function (PMF), which specifies P(\{\omega\}) for each \omega. For instance, a biased coin might have P(\{\text{heads}\}) = 0.7 and P(\{\text{tails}\}) = 0.3, reflecting unequal likelihoods due to physical imperfections. In the uniform case, the probability of any event E, viewed as a union of elementary events, simplifies to P(E) = \frac{|E|}{|\Omega|}, where |E| counts the favorable elementary outcomes. Uniform assignments thus reduce probability computations to mere counting of outcomes, avoiding the need to sum disparate values from a PMF. However, many real-world scenarios, such as biased experiments or weighted sampling, necessitate non-uniform models to accurately capture varying likelihoods among elementary events.
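
A short Python sketch contrasting the two cases follows; the fair die and the 0.7/0.3 biased coin are the illustrative examples from the paragraph above, and the helper functions are assumptions made for the example.

```python
# A sketch contrasting the uniform and non-uniform cases on finite spaces.
# The fair die and the 0.7/0.3 biased coin are the illustrative examples above.
from fractions import Fraction

# Uniform case: P(E) reduces to counting, |E| / |Ω|.
omega = {1, 2, 3, 4, 5, 6}

def uniform_prob(event):
    return Fraction(len(event), len(omega))

print(uniform_prob({2, 4, 6}))                        # 1/2

# Non-uniform case: a PMF specifies P({ω}) for each outcome.
biased_coin = {"heads": 0.7, "tails": 0.3}

def pmf_prob(event, pmf):
    return sum(pmf[w] for w in event)

print(pmf_prob({"heads"}, biased_coin))               # 0.7
```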

Examples and Illustrations

Discrete Sample Spaces

In discrete sample spaces, which are finite or countably infinite sets of possible outcomes, elementary events correspond to the individual outcomes that form the basic building blocks of the probability model. These spaces allow for the explicit listing of all outcomes, making it straightforward to identify and work with elementary events as the indivisible units from which more complex events are constructed. A classic example is the toss of a fair coin, where the sample space \Omega = \{H, T\} consists of two outcomes: heads (H) or tails (T). Here, the elementary events are the singletons \{H\} and \{T\}, each representing an atomic outcome of the experiment. Under a uniform probability assignment, the probability of each elementary event is P(\{H\}) = P(\{T\}) = \frac{1}{2}. Another illustrative case is the roll of a fair six-sided die, with \Omega = \{1, 2, 3, 4, 5, 6\}. The elementary events are \{1\}, \{2\}, \dots, \{6\}, each denoting the occurrence of a specific face. Under a uniform probability assignment, the measure assigned to each is P(\{i\}) = \frac{1}{6} for i = 1, 2, \dots, 6. The discrete nature of these sample spaces enables the direct enumeration of elementary events, which in turn facilitates the calculation of probabilities for compound events through simple summation of the probabilities of the constituent elementary events. This approach is particularly valuable in finite cases, as it provides a method to verify that the total probability sums to 1 across all elementary events.
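
A brief Python sketch of the die example illustrates this enumeration: it verifies that the elementary probabilities sum to 1 and computes a compound event by summation (the compound event chosen here, "roll at least 5", is an assumption for illustration).

```python
# A sketch enumerating the elementary events of the fair-die example,
# verifying that their probabilities sum to 1, and computing a compound
# event by summation (the fairness assumption gives each face 1/6).
from fractions import Fraction

omega = range(1, 7)
p = {i: Fraction(1, 6) for i in omega}                # P({i}) = 1/6

assert sum(p.values()) == 1                           # total probability check

at_least_five = {5, 6}                                # compound event
print(sum(p[i] for i in at_least_five))               # 1/3
```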

Continuous Sample Spaces

In continuous sample spaces, the sample space \Omega is uncountable, typically consisting of all real numbers within an interval or more complex sets, such as \Omega = [0,1] for a quantity representing proportions or normalized times. Here, elementary events are singletons \{x\} for each x \in \Omega, but unlike the discrete case, these have measure zero under the Lebesgue measure. For the uniform distribution on [0,1], the probability P(\{x\}) = 0 for any specific x, as the total probability mass of 1 is distributed continuously across the interval, making the likelihood of exact points negligible. Probability in such spaces is assigned via probability density functions rather than directly to singletons; meaningful events are intervals or sets with positive length, where P([a,b]) = \int_a^b f(t) \, dt for density f(t) = 1 in the uniform case. Elementary events serve as idealized building blocks, but their zero probability reflects the infinite divisibility of the space, ensuring the axioms of probability are satisfied without assigning positive mass to uncountably many points.

A similar structure applies to non-uniform continuous distributions, such as the standard normal distribution with density f(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}, where \Omega = \mathbb{R} and each elementary event \{x\} again satisfies P(\{x\}) = 0, despite the density f(x) providing the rate of probability accumulation around x. Probabilities are computed over intervals, like P(a < X < b) = \int_a^b f(x) \, dx, highlighting how the density informs event likelihoods without contradicting the zero-probability singletons.

This zero-probability nature poses a conceptual subtlety: elementary events in continuous spaces are not directly observable or practically distinguishable, as real measurements involve intervals due to precision limits. Consequently, analysis shifts emphasis to intervals or Borel sets as the basic measurable units, treating singletons theoretically while focusing empirical work on events with positive measure.
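
The following Python sketch makes the contrast concrete for the two distributions discussed above: degenerate intervals [x, x] receive probability zero, while genuine intervals receive positive probability from the density. The helper functions are assumptions for the example; only the standard library is used.

```python
# A sketch for continuous sample spaces: singletons carry zero probability,
# while intervals receive probability by integrating a density. Only the
# standard library is used (math.erf gives the standard normal CDF).
import math

def uniform_01_prob(a, b):
    """P(a <= X <= b) for X uniform on [0, 1]: the overlapping length."""
    lo, hi = max(a, 0.0), min(b, 1.0)
    return max(hi - lo, 0.0)

def normal_cdf(x):
    """Standard normal CDF expressed through the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# A single point is the degenerate interval [x, x] and has probability zero.
print(uniform_01_prob(0.3, 0.3))                      # 0.0
print(normal_cdf(0.3) - normal_cdf(0.3))              # 0.0

# Intervals, by contrast, carry positive probability.
print(uniform_01_prob(0.2, 0.5))                      # ≈ 0.3
print(normal_cdf(1.0) - normal_cdf(-1.0))             # ≈ 0.6827
```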

Distinctions from Other Concepts

Elementary vs. Compound Events

In probability theory, an elementary event consists of exactly one outcome from the sample space, making it the indivisible building block of all probabilistic analyses. These events, often denoted as singletons such as {1} in the context of rolling a die, serve as the atomic units upon which more complex structures are built. A compound event, by contrast, arises from the union of two or more elementary events, allowing it to encompass multiple outcomes. For instance, when rolling a six-sided die, the elementary events include {1}, {2}, {3}, {4}, {5}, and {6}, whereas the compound event of obtaining an even number is the subset {2, 4, 6}. This construction enables the representation of broader scenarios, such as "success" in a binary trial or "heads or tails" in a coin flip, which cannot be captured by a single elementary event.

The primary distinctions lie in their granularity and properties: elementary events are inherently indivisible and mutually exclusive, meaning distinct ones cannot occur simultaneously since they represent unique outcomes with no overlap. Compound events, however, are decomposable into their elementary components and may overlap with other events, permitting intersections or shared outcomes in more intricate probability spaces. This decomposability underscores a key conceptual role, where elementary events furnish the fine-grained resolution required to define and calculate probabilities for compound events through systematic aggregation. When a compound event is the union of disjoint elementary events, its probability equals the sum of the individual probabilities of those components, reflecting the additivity property of probability measures. This relation ensures that probabilities remain consistent and computable, as the total likelihood distributes additively across non-overlapping basic outcomes.
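
Both properties can be checked directly in a small Python sketch for the die example: pairwise disjointness of elementary events and the additive decomposition of the "even number" compound event. The fair-die probabilities are assumed, as in the text above.

```python
# A sketch of the elementary/compound distinction for a fair six-sided die:
# distinct elementary events are disjoint, and a compound event's probability
# is the sum over the elementary events composing it (fairness is assumed).
from fractions import Fraction
from itertools import combinations

elementary = [frozenset({i}) for i in range(1, 7)]
p = {e: Fraction(1, 6) for e in elementary}

# Distinct elementary events never overlap.
assert all(a.isdisjoint(b) for a, b in combinations(elementary, 2))

# The compound event "even number" decomposes into {2}, {4}, {6}.
even = frozenset({2, 4, 6})
parts = [e for e in elementary if e <= even]
assert sum(p[e] for e in parts) == Fraction(1, 2)
```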

Role in Sigma-Algebras

In the Kolmogorov axiomatic framework of probability, elementary events play a central role in the construction of sigma-algebras by serving as the foundational measurable sets that generate the collection of all permissible events. For a finite sample space \Omega, the elementary events are the singletons \{\omega\} for each \omega \in \Omega, and the smallest sigma-algebra containing these singletons is the power set 2^\Omega, which includes every possible union of elementary events and thus encompasses all subsets of \Omega as measurable events. This generation ensures that every event can be expressed as a union of elementary events, providing a complete foundation for probability assignments in discrete models.

In continuous sample spaces, such as \Omega = \mathbb{R}, the standard Borel sigma-algebra \mathcal{B}(\mathbb{R}) is employed, which is generated by the open intervals and contains all singletons \{x\} as measurable sets, since each singleton arises as a countable intersection of open intervals centered at x. Although the Borel sigma-algebra is not generated by the singletons alone, relying instead on intervals for its generation, the elementary events remain measurable and form the atomic level from which more complex Borel sets are built through countable operations, ensuring that point outcomes are always included in the event space. This measurability is crucial, as it allows probability measures such as the uniform (Lebesgue) measure on [0,1] to assign probabilities (typically zero) to elementary events while defining measures for intervals and their combinations.

Beyond basic spaces, elementary events underpin the sigma-algebras in abstract probability constructions, such as product spaces, where the product sigma-algebra is generated by cylinder sets that specify outcomes in finitely many coordinates, effectively projecting elementary events from component spaces to form the basis for infinite-dimensional models. In settings like Markov chains, which can be viewed as processes on product spaces, these cylinder sets derived from elementary events enable the definition of measurable spaces and probabilities. Within the standard Kolmogorov framework, elementary events thus consistently serve as the indivisible, measurable units that generate and populate the sigma-algebra, distinguishing classical probability from non-standard models such as quantum probability, where events may not correspond to classical singletons due to non-commutative structures.
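
For the finite case, the closure properties of the sigma-algebra generated by the singletons can be verified exhaustively; the short Python sketch below does this for an assumed three-element sample space.

```python
# A sketch for the finite case: the sigma-algebra generated by the singletons
# of a small Ω is the full power set, which contains Ω and ∅ and is closed
# under complements and unions. The three-element Ω is an assumption.
from itertools import combinations

omega = frozenset({"a", "b", "c"})

def power_set(s):
    items = list(s)
    return {frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)}

sigma = power_set(omega)                              # 2^Ω, here 8 events

assert omega in sigma and frozenset() in sigma        # contains Ω and ∅
assert all(omega - A in sigma for A in sigma)         # closed under complement
assert all(A | B in sigma for A in sigma for B in sigma)  # closed under union
```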
