
Balance equation

In probability theory and stochastic processes, a balance equation describes the condition under which the probability distribution of a Markov chain remains stationary over time. For a discrete-time Markov chain with finite state space and transition matrix P, the global balance equations are given by \pi P = \pi, where \pi is the row vector of stationary probabilities satisfying \sum_i \pi_i = 1 and \pi_i \geq 0 for all i. This equation ensures that, in equilibrium, the total probability flow into each state equals the flow out. These equations arise from the principle that the chain's long-run behavior stabilizes, with applications in modeling systems like queues and networks. In continuous-time Markov chains, balance equations take the form \pi Q = 0, where Q is the infinitesimal generator matrix, balancing rates of transitions. A stronger condition, detailed balance, requires \pi_i p_{ij} = \pi_j p_{ji} for all states i, j, implying the chain is reversible. Balance equations are fundamental for solving for equilibrium distributions and analyzing system performance, extending to local balance in structured chains and product-form solutions in complex models.

Introduction

Definition and Context

In stochastic processes, particularly continuous-time Markov chains, balance equations provide a foundational framework for determining the steady-state behavior of a system. At their core, these equations stipulate that, in equilibrium, the total probability flux entering a given state equals the total probability flux leaving that state, ensuring no net change in the probability distribution over time. This principle arises from the conservation of probability mass, where the system's probabilities must sum to one at all times, analogous to mass balance equations in physics and chemical kinetics that govern the flow of particles or energy in closed systems. Within the context of stochastic modeling, balance equations are primarily applied to irreducible and positive recurrent Markov chains to solve for the unique stationary distribution \pi, which satisfies the condition \pi Q = 0, where Q is the infinitesimal generator matrix encoding the transition rates between states. This equation captures the long-term equilibrium probabilities, assuming the chain is non-explosive and converges to a limiting distribution regardless of the initial state. Such models are ubiquitous in fields like queueing theory, reliability analysis, and performance evaluation, where they quantify steady-state occupancies under continuous transitions.

Balance equations manifest in several forms, each offering distinct insights into equilibrium conditions without altering the underlying conservation principle. Global balance equates the aggregate inflow and outflow across all states for each individual state, providing a comprehensive view of system-wide stability. Detailed balance, a stronger condition, requires pairwise equality of fluxes between specific state pairs, often implying time-reversibility in the process. Local balance focuses on equilibrium at the level of individual transitions or subsystems, facilitating analysis in complex networks like tandem queues. These variants enable tailored applications while all deriving from the same probabilistic steady-state requirement.

Historical Background

The balance equation concept originated in statistical mechanics, where it described conditions for thermodynamic equilibrium in physical systems. Ludwig Boltzmann introduced the principle of detailed balance in his 1872 paper "Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen," applying it to gas molecules to explain equilibrium in collisions, ensuring that forward and reverse processes occur at equal rates at thermodynamic equilibrium. This foundational idea linked microscopic dynamics to macroscopic thermodynamics, forming the basis for later probabilistic interpretations. The transition to probability theory occurred in the early 20th century with the development of Markov chain theory. Andrey Markov established the framework for such stochastic processes in his 1906 work "Rasprostranenie zakona bol'shikh chisel na velichiny, zavisimye drug ot druga" and subsequent papers through 1913, where global balance equations emerged as the standard steady-state conditions for finding invariant distributions in Markov chains. In 1936, Andrey Kolmogorov advanced this by establishing in "Zur Theorie der Markoffschen Ketten" the criterion for reversibility of Markov chains, which is equivalent to detailed balance with respect to the stationary distribution, connecting physical reversibility to probabilistic stationarity. Further refinements appeared in queueing theory during the mid-20th century. Peter Whittle formulated local balance in 1968 within his analysis of queueing systems, as detailed in "Equilibrium Distributions for an Open Migration Process," to derive product-form solutions for networks by balancing flows at subsets of states rather than globally. In modern extensions, Erol Gelenbe's 1993 introduction of G-networks in "G-Networks with Triggered Customer Movement" demonstrated product-form stationary distributions without requiring local balance, incorporating negative customers to model complex interactions. Additionally, following the 1953 Metropolis algorithm, balance equations became integral to Markov chain Monte Carlo methods, enforcing detailed balance to ensure stationarity and convergence to target distributions.

Fundamentals of Markov Chains

Continuous-Time Markov Chains

A continuous-time Markov chain (CTMC) is a stochastic process \{X(t): t \geq 0\} with a countable state space S, satisfying the Markov property: for any t \geq 0 and s > 0, the conditional distribution of X(t+s) given the history up to time t depends only on X(t). The holding time in any state i \in S follows an exponential distribution with rate \nu_i = -q_{ii}, where q_{ii} is the diagonal entry of the rate matrix, and upon departure from i, the process jumps to a different state j \neq i with probability q_{ij}/\nu_i. The dynamics of a CTMC are governed by its infinitesimal generator matrix Q = (q_{ij}), a matrix whose off-diagonal entries q_{ij} \geq 0 for i \neq j represent transition rates from i to j, and whose diagonal entries satisfy q_{ii} = -\sum_{j \neq i} q_{ij}, ensuring that each row sums to zero. This generator encapsulates the instantaneous behavior, with the process remaining conservative (no mass loss) due to the row-sum property. The transition probabilities P(t) = (p_{ij}(t)), where p_{ij}(t) = \Pr(X(t+s) = j \mid X(s) = i), evolve according to the Chapman-Kolmogorov equations, which for CTMCs take the form P'(t) = P(t) Q with P(0) = I, the identity matrix. This forward Kolmogorov equation describes the time derivative of the transition probability matrix, linking the generator directly to the process evolution. For balance equations to apply meaningfully, the CTMC must be irreducible—meaning every state is reachable from every other—and positive recurrent, ensuring the existence of a unique stationary distribution that the process approaches as t \to \infty. These conditions guarantee long-term stability without transient or null recurrent behavior.
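As a concrete sketch, the block below builds the generator matrix of a hypothetical three-state repair model (all rates invented for illustration) and checks the row-sum property and the embedded jump probabilities q_{ij}/\nu_i described above.

```python
# Hypothetical 3-state CTMC generator: idle -> busy -> failed -> idle.
Q = [
    [-2.0,  2.0,  0.0],   # leave "idle" at rate 2 (to "busy")
    [ 3.0, -4.0,  1.0],   # leave "busy" at rates 3 (to "idle") and 1 (to "failed")
    [ 0.5,  0.0, -0.5],   # leave "failed" at rate 0.5 (back to "idle")
]

# Rows of a conservative generator sum to zero.
for row in Q:
    assert abs(sum(row)) < 1e-12

# Holding-time rate in state i is nu_i = -q_ii, and the embedded jump
# chain moves to j != i with probability q_ij / nu_i.
def jump_probabilities(Q, i):
    nu = -Q[i][i]
    return [Q[i][j] / nu if j != i else 0.0 for j in range(len(Q))]

print(jump_probabilities(Q, 1))  # [0.75, 0.0, 0.25]
```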

Equilibrium Distributions

In Markov chains, the equilibrium distribution, also known as the stationary distribution, is a probability vector \pi = (\pi_1, \pi_2, \dots, \pi_n) that remains unchanged under the chain's dynamics. For a discrete-time Markov chain (DTMC) with transition matrix P, \pi satisfies \pi P = \pi, where each \pi_i \geq 0 represents the long-run probability of being in state i, and the normalization \sum_i \pi_i = 1 ensures it is a valid probability distribution. For a continuous-time Markov chain (CTMC) with infinitesimal generator matrix Q—whose off-diagonal entries are transition rates and diagonal entries are the negatives of the row sums—\pi satisfies \pi Q = 0 along with the same normalization \sum_i \pi_i = 1. The condition \pi Q = 0 for a CTMC arises from the master equation, or Kolmogorov forward equation, which governs the time evolution of the probability vector \mathbf{p}(t): \frac{d\mathbf{p}(t)}{dt} = \mathbf{p}(t) Q. At equilibrium, the system reaches a steady state where \frac{d\mathbf{p}(t)}{dt} = 0, implying \pi Q = 0. This condition balances the total probability inflow to each state from all other states with the total outflow from that state, reflecting a dynamic equilibrium where probabilities are conserved without net change over time. Under the assumptions of irreducibility (every state is reachable from every other) and positive recurrence (expected return times to any state are finite), the stationary distribution \pi is unique for both DTMCs and CTMCs. For finite state spaces, irreducibility alone guarantees positive recurrence, ensuring a unique \pi > 0 (componentwise positive). In infinite state spaces, additional conditions like aperiodicity may be needed for convergence to \pi. Computing \pi involves solving the linear system \pi (P - I) = 0 for DTMCs or \pi Q = 0 for CTMCs subject to normalization, which becomes computationally challenging for large state spaces due to the high dimensionality and ill-conditioning of the matrices; iterative methods like the power method or structured approximations are often employed to address scalability.
For ergodic Markov chains—those that are irreducible, positive recurrent, and aperiodic—the stationary distribution \pi equals the long-run time-average proportions of time spent in each state, regardless of the initial distribution. This ergodic theorem links the theoretical \pi to observable empirical frequencies in simulations or real systems, providing a foundation for estimating equilibrium behavior from trajectories.
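The iterative approach can be sketched in a few lines; this is a minimal power-iteration example on an invented 3-state DTMC, not a production solver.

```python
# Estimate the stationary distribution of a small ergodic DTMC by
# power iteration: repeatedly apply pi <- pi P until pi P = pi.
P = [
    [0.5,  0.5,  0.0],
    [0.25, 0.5,  0.25],
    [0.0,  0.5,  0.5],
]

def step(pi, P):
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

pi = [1.0, 0.0, 0.0]        # any initial distribution works for an ergodic chain
for _ in range(200):
    pi = step(pi, P)

# Fixed-point check: pi P = pi, and the probabilities sum to one.
assert all(abs(a - b) < 1e-9 for a, b in zip(pi, step(pi, P)))
assert abs(sum(pi) - 1.0) < 1e-9
print([round(x, 3) for x in pi])  # -> [0.25, 0.5, 0.25]
```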

Types of Balance Equations

Global Balance

In continuous-time Markov chains (CTMCs), the global balance equations represent the steady-state condition where, for each state i, the total probability flux leaving state i equals the total probability flux entering it from all other states. This is expressed as \pi_i \sum_{j \neq i} q_{ij} = \sum_{j \neq i} \pi_j q_{ji}, where \pi_i is the stationary probability of state i, and q_{ij} denotes the transition rate from state i to state j (with q_{ii} = -\sum_{j \neq i} q_{ij}). In matrix notation, these equations are compactly written as \pi Q = 0, where \pi is the row vector of stationary probabilities and Q is the infinitesimal generator matrix with off-diagonal entries q_{ij} (for i \neq j) and diagonal entries q_{ii}. Since the rows of Q sum to zero, the system consists of |S| equations, but one is redundant due to linear dependence, reducing the independent equations to |S| - 1 when combined with the normalization condition \sum_i \pi_i = 1. For discrete-time Markov chains (DTMCs), the global balance equations similarly equate the total probability outflow and inflow for each state i: \pi_i \sum_{j \neq i} p_{ij} = \sum_{j \neq i} \pi_j p_{ji}, where p_{ij} is the one-step transition probability from state i to j. Equivalently, in matrix form, \pi P = \pi, where P is the transition matrix, again yielding |S| - 1 independent equations under normalization. Solving these linear equations becomes computationally challenging for large state spaces, such as those in complex queueing models, where the number of states grows exponentially, leading to state space explosion and requiring techniques like state aggregation or approximations to make the system tractable. The global balance equations hold for any ergodic Markov chain—meaning irreducible and positive recurrent—without needing further structural assumptions, ensuring a unique stationary distribution exists and is given by the solution to these equations.
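A numeric check of the global balance equations on a small invented birth-death CTMC (birth rate 1, death rate 2, three states) might look as follows.

```python
# Global balance check: pi_i * sum_{j != i} q_ij = sum_{j != i} pi_j * q_ji.
Q = [
    [-1.0,  1.0,  0.0],
    [ 2.0, -3.0,  1.0],
    [ 0.0,  2.0, -2.0],
]
pi = [4/7, 2/7, 1/7]   # stationary distribution of this chain

n = len(Q)
for i in range(n):
    outflow = pi[i] * sum(Q[i][j] for j in range(n) if j != i)
    inflow = sum(pi[j] * Q[j][i] for j in range(n) if j != i)
    assert abs(outflow - inflow) < 1e-12

# Equivalently, pi Q = 0 in matrix form.
assert all(abs(sum(pi[i] * Q[i][j] for i in range(n))) < 1e-12 for j in range(n))
print("global balance holds")
```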

Detailed Balance

Detailed balance is a stronger condition than global balance in Markov chains, requiring that the probability flux between every pair of distinct states i and j is equal in both directions under the stationary distribution \pi. For a continuous-time Markov chain (CTMC) with infinitesimal generator Q = (q_{ij}), where q_{ij} denotes the transition rate from state i to state j, detailed balance holds if \pi_i q_{ij} = \pi_j q_{ji} for all i \neq j. Similarly, for a discrete-time Markov chain (DTMC) with transition matrix P = (p_{ij}), the condition is \pi_i p_{ij} = \pi_j p_{ji} for all i \neq j. A chain satisfying detailed balance with respect to \pi is called reversible, as the reversed process has the same transition probabilities. If detailed balance is satisfied, the stationary distribution \pi automatically satisfies the global balance equations, since summing the detailed balance conditions over all j \neq i shows the net flux into and out of state i is zero. This property simplifies the computation of \pi in certain structured chains. For birth-death processes, a subclass of CTMCs where transitions occur only between adjacent states, detailed balance enables an explicit product-form solution for the stationary distribution: \pi_i = \pi_0 \prod_{k=1}^i \frac{q_{k-1,k}}{q_{k,k-1}} for i \geq 1, where \pi_0 is a normalization constant ensuring \sum_i \pi_i = 1, and q_{k-1,k} (resp., q_{k,k-1}) is the birth rate at state k-1 (resp., the death rate at state k). Kolmogorov's criterion provides a necessary and sufficient condition for reversibility: an irreducible Markov chain is reversible with respect to some \pi if and only if, for every cycle of states i_1, i_2, \dots, i_n, i_1, the product of transition probabilities (or rates) around the cycle equals the product in the reverse direction, i.e., \prod_{k=1}^n p_{i_k i_{k+1}} = \prod_{k=1}^n p_{i_{k+1} i_k} (with i_{n+1} = i_1). This criterion, originally established for DTMCs and extendable to CTMCs via embedded chains, confirms that detailed balance holds globally if cycle probabilities balance.
Reversible chains arise naturally in symmetric systems, such as the simple random walk on an undirected graph, where the transition probability from i to an adjacent j is p_{ij} = 1/d_i (with d_i the degree of i), and the stationary distribution is \pi_i = d_i / (2m) (with m the number of edges); this satisfies detailed balance because \pi_i p_{ij} = 1/(2m) = \pi_j p_{ji} for every edge. However, not all Markov chains in applications satisfy detailed balance; for instance, Jackson queueing networks, which model multiple service stations with Markovian routing, are generally non-reversible despite having a product-form stationary distribution, requiring global balance for analysis rather than pairwise flux equality.
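The birth-death product form above is straightforward to compute; the sketch below uses hypothetical birth and death rates and verifies detailed balance across each adjacent pair of states.

```python
# Finite birth-death chain: pi_i = pi_0 * prod_{k=1}^{i} q_{k-1,k} / q_{k,k-1}.
births = [1.0, 1.0, 1.0]     # q_{k,k+1} for k = 0, 1, 2 (invented rates)
deaths = [2.0, 2.0, 2.0]     # q_{k+1,k} for k = 0, 1, 2

weights = [1.0]              # unnormalized pi, starting from state 0
for b, d in zip(births, deaths):
    weights.append(weights[-1] * b / d)

total = sum(weights)         # normalization so probabilities sum to one
pi = [w / total for w in weights]

# Detailed balance across every adjacent pair of states.
for k, (b, d) in enumerate(zip(births, deaths)):
    assert abs(pi[k] * b - pi[k + 1] * d) < 1e-12

print([round(x, 4) for x in pi])   # geometric-like, each term half the previous
```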

Local Balance

Local balance in queueing networks refers to a subset of the detailed balance equations that hold for specific groups of transitions, enabling cancellation of terms within defined routing subsets and facilitating the derivation of product-form stationary distributions without requiring full pairwise reversibility across all states. Specifically, for transitions from state α to state β, the condition takes the form π(α) λ_α = π(β) μ_β, where π denotes the stationary probability, λ_α is the arrival rate at α, and μ_β is the service rate at β. This concept was introduced by Whittle in his 1968 analysis of interacting migration processes, where local balance across network "cuts" or subsets decomposes the global balance equations, simplifying the computation of equilibrium distributions in queueing networks without necessitating complete reversibility. In Jackson networks, comprising open queues with Poisson arrivals, exponential service times, and Markovian routing, local balance manifests as flow equilibrium at each individual node, ensuring that the inflow rate equals the outflow rate for each customer class. This node-wise balance yields the product-form stationary distribution π(n) = \prod_k π_k(n_k), where n represents the vector of queue lengths and π_k is the marginal distribution at queue k, akin to an independent M/M/1 queue with its effective arrival rate. Although local balance often supports product-form solutions, it is not a strict requirement, as demonstrated by Gelenbe's 1993 extension of G-networks incorporating triggered positive customer removal, which achieves product form through batch departures without satisfying traditional local balance conditions. Local balance applies effectively in open networks featuring Markovian routing probabilities and independent external arrivals, but it generally does not hold in closed networks with fixed population sizes or in systems with non-Markovian service or routing dependencies.
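As an illustration, the following sketch solves the traffic equations for a hypothetical two-node open Jackson network (all rates and routing probabilities invented) and assembles the product-form distribution described above.

```python
# Two-node open Jackson network: solve lambda_k = gamma_k + sum_i lambda_i r_ik
# by fixed-point iteration, then form the product of M/M/1 marginals.
gamma = [1.0, 0.0]                 # external arrival rates
r = [[0.0, 0.5],                   # routing: node 0 -> node 1 w.p. 0.5, else exit
     [0.0, 0.0]]                   # node 1 -> exit w.p. 1
mu = [2.0, 1.5]                    # service rates

lam = gamma[:]
for _ in range(100):               # converges since spectral radius of r < 1
    lam = [gamma[k] + sum(lam[i] * r[i][k] for i in range(2)) for k in range(2)]

rho = [lam[k] / mu[k] for k in range(2)]   # utilizations, both < 1 here

def pi(n):
    """Joint stationary probability of queue lengths n = (n_0, n_1)."""
    p = 1.0
    for k, nk in enumerate(n):
        p *= (1 - rho[k]) * rho[k] ** nk
    return p

print(lam, rho)            # lam converges to [1.0, 0.5]; rho = [0.5, 1/3]
print(round(pi((0, 0)), 4))
```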

Properties and Implications

Reversibility

In continuous-time Markov chains, reversibility is the property that the time-reversed process has the same distribution as the original when the chain is in equilibrium with its stationary distribution \pi. The reversed process \{X'(t)\} is constructed with transition rates q'_{ji} = \frac{\pi_i}{\pi_j} q_{ij} for i \neq j, where q_{ij} are the original rates, and the chain is reversible if \{X'(t)\} is stochastically identical to the forward process \{X(t)\}. Reversibility with respect to the stationary distribution \pi holds if and only if the chain satisfies detailed balance, meaning \pi_i q_{ij} = \pi_j q_{ji} for all states i, j. This equivalence implies that the forward and reverse flows between any pair of states are balanced in equilibrium. Reversible chains exhibit a symmetric structure in their transition graph when edges are weighted by the stationary probabilities, such that the matrix with entries \sqrt{\pi_i / \pi_j} \, q_{ij} is symmetric, reflecting the underlying time-reversal invariance. In such chains, time averages of state occupancies or functions along sample paths equal ensemble averages over the stationary distribution, due to the indistinguishability of forward and backward trajectories. To test for reversibility, one can verify the detailed balance equations directly using the known or computed stationary distribution \pi. Equivalently, the Kolmogorov cycle condition serves as a necessary and sufficient criterion: for every cycle of distinct states i_1, i_2, \dots, i_n, i_1, the product of forward transition rates \prod_{k=1}^n q_{i_k i_{k+1}} equals the product of backward rates \prod_{k=1}^n q_{i_{k+1} i_k}. This property aids simulations, such as in Markov chain Monte Carlo (MCMC) methods, where reversibility via detailed balance ensures the chain targets the correct stationary distribution and simplifies convergence analysis. Non-reversible chains arise in cases like directed cycles with asymmetric rates, where the Kolmogorov cycle condition fails—for example, a three-state cycle with unequal forward and backward rate products violates the flux balance required for time-reversal equivalence.
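Kolmogorov's cycle criterion lends itself to a direct computational check; the sketch below compares forward and backward rate products around the three-cycle of two invented generators, one symmetric and one driven.

```python
# Kolmogorov cycle criterion: a chain is reversible iff forward and
# backward rate products agree on every cycle.
def cycle_products(Q, cycle):
    fwd = bwd = 1.0
    for a, b in zip(cycle, cycle[1:] + cycle[:1]):
        fwd *= Q[a][b]   # product of rates along the cycle
        bwd *= Q[b][a]   # product of rates against the cycle
    return fwd, bwd

# Symmetric rates: reversible.
Q_sym = [[-2, 1, 1], [1, -2, 1], [1, 1, -2]]
f, b = cycle_products(Q_sym, [0, 1, 2])
assert f == b

# Biased clockwise rotation: not reversible.
Q_drive = [[-3, 2, 1], [1, -3, 2], [2, 1, -3]]
f, b = cycle_products(Q_drive, [0, 1, 2])
assert f != b
print(f, b)   # 8.0 vs 1.0: forward flux dominates, time reversal fails
```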

Product-Form Solutions

In queueing networks and multi-component Markov processes, product-form solutions arise when the stationary distribution of the joint state satisfies a separability condition derived from balance equations. Specifically, the probability vector \pi takes the form \pi = \bigotimes_{k=1}^M \pi_k, where each \pi_k is the marginal stationary distribution for subsystem k and depends only on the parameters of that subsystem, such as arrival rates or service disciplines. This factorization simplifies the analysis of complex systems by treating components as effectively independent in equilibrium, despite potential interactions through shared resources. Such product-form distributions emerge under certain balance conditions, particularly detailed or local balance, in networks satisfying the BCMP theorem. BCMP networks, introduced by Baskett, Chandy, Muntz, and Palacios in 1975, encompass open, closed, and mixed queueing systems where customers follow routing probabilities and queues operate under compatible disciplines like first-come-first-served with exponential service or processor sharing. In these settings, the global balance equations, when combined with local balance at each node—ensuring that the flow into and out of states at a single queue balances independently of the rest of the network—lead to the product form. The theorem guarantees this property for multiple customer classes as long as the traffic equations (relating arrival rates to routing) are satisfied. A sketch of the derivation begins with the local balance condition across nodes: for each queue k, the rate of transitions into a state configuration at k equals the rate out, marginalized over the other queues. Under the traffic equations, which solve for effective arrival rates \lambda_k = \gamma_k + \sum_i \lambda_i p_{ik}, where \gamma_k is the external arrival rate (zero in closed networks) and p_{ik} is the routing probability from queue i to k, this implies that the joint stationary distribution factors into independent marginals \pi_k(n_k) = \frac{\rho_k^{n_k}}{Z_k} for utilization \rho_k = \lambda_k / \mu_k and normalization constant Z_k, yielding the overall product form.
This holds in reversible networks but extends to non-reversible cases via the BCMP conditions. Extensions to closed networks, where the total number of customers is fixed, follow the Gordon-Newell theorem of 1967, which establishes product-form solutions for closed networks of exponential servers under similar balance assumptions, with a normalization constant obtained by summing the product measure over the constrained state space. However, product forms fail in non-Markovian settings, such as those with general service times or state-dependent rates that violate local balance, leading to intractable joint distributions. Computationally, these solutions reduce the problem dimensionality from exponential in the total state space (e.g., O(N^M) for M queues with capacity N) to linear in the number of subsystems, enabling efficient evaluation of performance measures like mean queue lengths via iterative algorithms.
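For closed networks, the normalization constant can be computed by convolution rather than brute-force summation over the constrained state space; the sketch below implements the classic convolution recurrence (Buzen's algorithm) for single-server nodes with invented relative utilizations.

```python
# Buzen's convolution algorithm for the normalization constant G(K) of a
# closed network of single-server queues, with x_k = v_k / mu_k the
# relative utilization of node k.
def normalization_constants(x, K):
    """Return g where g[n] = G(n) for populations n = 0..K."""
    g = [1.0] + [0.0] * K          # empty network: G(0) = 1
    for xk in x:
        for n in range(1, K + 1):
            g[n] += xk * g[n - 1]  # convolve node k into the running G
    return g

x = [0.5, 0.25]                    # hypothetical relative utilizations
K = 3                              # fixed customer population
g = normalization_constants(x, K)
print(g)

# Joint probability of populations (n_0, n_1) with n_0 + n_1 = K.
pi = {(n, K - n): x[0]**n * x[1]**(K - n) / g[K] for n in range(K + 1)}
assert abs(sum(pi.values()) - 1.0) < 1e-12
```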

Applications

In Queueing Theory

In queueing theory, balance equations are fundamental for analyzing the steady-state behavior of systems modeling service facilities, such as call centers, computer networks, and manufacturing lines, where customers arrive, wait, and depart. These equations, derived from the global or detailed balance conditions of underlying continuous-time Markov chains, enable the computation of stationary distributions that quantify queue lengths, waiting times, and throughput. By solving these equations, analysts can evaluate performance under varying loads, ensuring stability when the traffic intensity ρ < 1 across nodes.

A key application is in open Jackson networks, which consist of multiple nodes connected by probabilistic routing, with Poisson arrivals and exponential service times at each Markovian queue. In these networks, the global balance equations yield a product-form stationary distribution, decoupling the joint probabilities into independent marginals for each node. For a network of single-server (M/M/1) queues, the stationary probability of state vector n = (n_1, n_2, ..., n_K) is given by \pi(\mathbf{n}) = \prod_{k=1}^K (1 - \rho_k) \rho_k^{n_k}, where ρ_k = λ_k / μ_k is the traffic intensity at node k, λ_k the effective arrival rate, and μ_k the service rate; this form holds provided the network is acyclic or, more generally, the routing matrix has spectral radius less than 1 so that the traffic equations have a unique solution. This result, established through solving the infinite system of balance equations, allows exact performance analysis without enumerating all states.

The M/M/s queue, a birth-death process modeling multi-server facilities with Poisson arrivals at rate λ and exponential service at rate μ per server, illustrates detailed balance directly. Here, the balance equations equate the flow between adjacent states: for n ≤ s, λ π_{n-1} = n μ π_n, and for n > s, λ π_{n-1} = s μ π_n. Solving recursively yields the stationary distribution \pi_n = \pi_0 \frac{(s \rho)^n}{n!} \quad (n \leq s), \quad \pi_n = \pi_0 \frac{(s \rho)^n}{s! \, s^{n-s}} \quad (n > s), with ρ = λ / (s μ) < 1 and π_0 = \left[ \sum_{n=0}^s \frac{(s \rho)^n}{n!} + \frac{(s \rho)^s \rho}{s! (1 - \rho)} \right]^{-1}; this explicit solution facilitates quick computation of blocking probabilities and waiting times.

For closed networks, where a fixed number K of customers circulates without external arrivals or departures, the Gordon-Newell theorem applies to Markovian systems with exponential servers. Local balance equations at each node lead to a product form π(n) ∝ \prod_k f_k(n_k; v_k), where f_k is the marginal weight at node k (e.g., v_k^{n_k}/n_k! for infinite-server nodes or v_k^{n_k} for single-server nodes) and v_k the relative visit rates; the normalization constant G(K, v) is computed as G(K, v) = \sum_{\mathbf{n}: |\mathbf{n}| = K} \prod_k f_k(n_k; v_k), often using iterative convolution methods for large K. This approach addresses the finite-state constraint, enabling analysis of resource-constrained systems like database transactions.

In large-scale queueing networks, exact balance-based solutions become intractable due to state-space explosion, prompting mean-field approximations that treat nodes as interacting particles in the limit of infinitely many queues. These approximations replace the balance equations with deterministic fluid or diffusion equations capturing average behavior, such as occupancy fractions evolving via drift terms proportional to arrival-service imbalances; for instance, in random-access networks, the fixed-point solution of the mean-field ODE yields asymptotic queue-length distributions with error O(1/N) for N queues. Such methods bridge classical product forms to modern scalable analyses, as in power-of-d-choices routing.

Performance metrics in these models are derived from the stationary distribution π using balance-derived relations like Little's law, which states that the average number of customers L = λ W, where λ is the arrival rate and W the mean sojourn time. For example, in an M/M/1 queue, L = \sum_{n=0}^∞ n π_n = ρ / (1 - ρ), directly linking balance solutions to operational insights like expected queue length under load ρ. This integration ensures balance equations inform practical design, such as staffing levels to minimize delays.
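The M/M/s distribution and the Little's law check can be reproduced numerically; this is a minimal sketch with a truncated state space, not an exact closed-form evaluation.

```python
import math

# M/M/s stationary distribution from the birth-death balance equations,
# truncated at n_max states (truncation error is negligible for rho < 1).
def mms_pi(lam, mu, s, n_max=200):
    a = lam / mu                   # offered load, equal to s * rho
    rho = lam / (s * mu)
    assert rho < 1, "queue is unstable"
    weights = []
    for n in range(n_max + 1):
        if n <= s:
            weights.append(a ** n / math.factorial(n))
        else:
            weights.append(a ** n / (math.factorial(s) * s ** (n - s)))
    total = sum(weights)
    return [w / total for w in weights]

# M/M/1 special case: mean queue length should match rho / (1 - rho).
pi = mms_pi(lam=0.5, mu=1.0, s=1)
L = sum(n * p for n, p in enumerate(pi))
print(round(L, 6))   # rho = 0.5, so L = 0.5 / 0.5 = 1.0
```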

In Chemical Kinetics and Other Systems

In chemical kinetics, balance equations underpin the master equation, which describes the time evolution of probability distributions for molecular states in reacting systems. The master equation for a system of chemical reactions is a set of coupled differential equations governing the populations of species, where transition rates between states reflect reaction kinetics. For reversible reactions satisfying detailed balance, the condition π_i k_{i→j} = π_j k_{j→i} holds at equilibrium, ensuring that the forward and reverse fluxes between each pair of states are equal. A simple example is the isomerization A ⇌ B, where the steady-state probabilities π_A and π_B satisfy π_A k_{A→B} = π_B k_{B→A}, leading to the Boltzmann distribution π_i ∝ exp(-E_i / kT), with E_i the energy of state i and kT the thermal energy. This framework extends to complex networks, where detailed balance implies a unique equilibrium distribution, often a product of Poisson distributions for particle numbers, enabling gradient-flow formulations driven by relative entropy minimization.

In population ecology, balance equations model immigration-emigration processes as continuous-time birth-death chains, where births represent immigration or reproduction and deaths represent emigration or mortality. Global balance equations equate the total inflow and outflow probabilities for each population size n, yielding steady-state distributions p_n^* that satisfy \sum_j p_j^* q_{jn} = p_n^* \sum_j q_{nj} for transition rates q. For linear rates with constant immigration rate ν and per-capita death rate μ, the steady state follows a Poisson distribution with mean ν / μ, reflecting stable equilibria in simple population models. These chains approximate deterministic logistic growth in large populations, with global balance providing the criterion for persistence versus extinction.

Physics applications of balance equations include the Ehrenfest urn model, a discrete-time Markov chain simulating the diffusion of N particles between two chambers.
In this model, a particle is selected uniformly at random and moved to the opposite urn, satisfying detailed balance through balanced one-step fluxes between states differing by one particle, resulting in a stationary distribution P(k) = \binom{N}{k} (1/2)^N, where k is the number of particles in one urn. This illustrates reversible processes approaching equilibrium. In nonequilibrium statistical mechanics, detailed balance breaks down in driven systems like molecular motors, where entropy production arises from unbalanced fluxes, quantified by the Kullback-Leibler divergence between forward and reversed trajectories, enabling inference of dissipation even without observable currents.

Other systems leverage balance equations for sampling and modeling. In Markov chain Monte Carlo (MCMC) methods, detailed balance ensures the target distribution π is invariant under transitions, with the Metropolis-Hastings acceptance ratio min(1, [π(y') q(y|y')] / [π(y) q(y'|y)]) balancing fluxes for asymmetric proposals, guaranteeing convergence to correct posteriors in Bayesian inference. In epidemiology, stochastic SIR models approximate balance in endemic equilibria, where global balance equates infection and recovery rates across compartments, though full detailed balance is often violated in cyclic SIRS variants due to irreversible transitions. Emerging applications include machine learning, where balance equations inform physics-aware diffusion models by enforcing conservation laws like Fick's law in neural network loss functions, deriving parabolic PDEs ∂ρ/∂t = D ∇²ρ for density ρ and diffusion coefficient D to generate physically consistent samples. In climate modeling, energy balance models solve for global radiative equilibria, such as C dT/dt + A + B T = Q (1 - α), balancing absorbed solar flux Q(1 - α) (with albedo α) against outgoing longwave radiation A + BT, with parameters A ≈ 203 W/m² and B ≈ 2 W/(m²·°C) capturing feedbacks like ice-albedo for sensitivity analysis. These models often require numerical solutions for spatially resolved systems.
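The Ehrenfest model's detailed balance property can be verified directly; the following sketch checks that the binomial stationary distribution balances the one-step fluxes for N = 10.

```python
import math

# Ehrenfest urn on N particles: transition probabilities p(k -> k-1) = k/N
# and p(k -> k+1) = (N - k)/N satisfy detailed balance with the binomial
# stationary distribution P(k) = C(N, k) / 2^N.
N = 10
P = [math.comb(N, k) / 2**N for k in range(N + 1)]

for k in range(N):
    up = P[k] * (N - k) / N        # flux k -> k+1
    down = P[k + 1] * (k + 1) / N  # flux k+1 -> k
    assert abs(up - down) < 1e-12

assert abs(sum(P) - 1.0) < 1e-12
print("detailed balance holds for the Ehrenfest model")
```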

References

  1. [1]
    None
    ### Definition, Form, and Key Concepts of the General Balance Equation
  2. [2]
    [PDF] Processes General balance equation - Chemical Engineering
    This is the most general balance equation that can be written to balance quantities like mass, energy, momentum etc. Each term in the balance equation has ...
  3. [3]
    Total Mass Balance Equation - an overview | ScienceDirect Topics
    4.3.1 General balance equation. One of the governing concepts in any separation process is the principle of mass/matter conservation, which states that the mass ...
  4. [4]
    Material Balance Equation in Reservoir Engineering‎
    Mar 20, 2016 · Material Balance Equation in Reservoir Engineering · F = N×Et + We · #1 form: · #2 form:.
  5. [5]
    [PDF] Markov Chains - CAPE
    This book is an account of the elementary theory of Markov chains, with applications. It was conceived as a text for advanced undergraduates or master's level ...
  6. [6]
    [PDF] 5 Markov Chain Models - Taras I.Lakoba - University Of Vermont
    The idea of these balance equations is, roughly speaking, this: the inflow into a given state of the system must equal the outflow from this state. What flows ...
  7. [7]
    [PDF] 1 IEOR 6711: Continuous-Time Markov Chains
    Definition 1.1 A stochastic process {X(t) : t ≥ 0} with discrete state space S is called a continuous-time Markvov chain (CTMC) if for all t ≥ 0, s ≥ 0, i ∈ S, ...
  8. [8]
    [PDF] CONTINUOUS-TIME MARKOV CHAINS Definition 1. Acontinuous ...
    Kolmogorov Equations. Definition 2. Theinfinitesimal generator (also called theQmatrix) of a continuous-time Markov chain is the matrix Q = (qx,y ) ...Missing: infinitesimal | Show results with:infinitesimal
  9. [9]
    [PDF] CONTINUOUS TIME MARKOV CHAINS 1. Introduction Discrete-time ...
    Kolmogorov's Forward and Backward Equations. Definition 5.1. The infinitesimal generator (also called theQ−matrix) of a continuous- time Markov chain is the ...
  10. [10]
    [PDF] 5 Continuous-Time Markov Chains - TTU Math
    Chapman-Kolmogorov equations pji (t + ∆t) = ∞. X k=0 pjk(∆t)pki (t) ... backward Kolmogorov equations. ▷ Assume the state space of a finite Markov chain ...
  11. [11]
    [PDF] Continuous-time Markov Chains - University of Rochester
    Oct 31, 2016 · Consider irreducible, positive recurrent CTMC with transition rates νi and ... ▻ Continuous-time Markov chain. ▻ Markov property. ▻ Time ...
  12. [12]
    [PDF] 1 Limiting distribution for a Markov chain
    Theorem 2.2 Every irreducible Markov chain with a finite state space is positive recurrent and thus has a stationary distribution (unique probability solution ...
  13. [13]
    [PDF] Stationary distributions and limiting probabilities
    Def 0.1. π is called a stationary (or equilibrium) distribution of the. Markov chain if it satisfies. π = πP, (π is a left eigenvector corresponding to 1)
  14. [14]
    [PDF] Chapter 1 CONTINUOUS TIME MARKOV CHAIN MODELS FOR ...
    A reaction network is modeled as a continuous time Markov chain where the state is the number of molecules of each species, and reactions are transitions.
  15. [15]
    [PDF] 1 Continuous Time Processes - 1.1 Continuous Time Markov Chains
    May 1, 2011 · Find the infinitesimal generator of this Markov process. 3. Find its stationary distribution by making reasonable assumption on. µj's and α < 1.
  16. [16]
    [PDF] MARKOV CHAINS: BASIC THEORY 1.1. Definition and First ...
    Markov chain is positive recurrent and irreducible, then there is a unique stationary probability distribution π, and for all states x,y , with Py ...
  17. [17]
    [PDF] Notes 22 : Markov chains: stationary measures
    1.4 Stationary distributions and positive recurrence. So far, we have focused on the existence and uniqueness of a stationary measure. Here we address these ...
  18. [18]
    [PDF] Red Light Green Light Method for Solving Large Markov Chains
    Sep 7, 2022 · These large Markov chains require computation methods with low complexity, small storage requirements, and distributed implementation. Few ...
  19. [19]
    Computing stationary distribution locally - DSpace@MIT
    In this thesis, we set out to address this outstanding challenge of computing the stationary probability of a given state in a Markov chain locally, efficiently ...
  20. [20]
    [PDF] 25 Ergodicity for Finite-State Discrete-Time Markov Chains
    For such chains, the stationary distribution represents the long-run time-average proportion of time spent in each state, that is, the pj's. Very roughly ...
  21. [21]
    [PDF] Lecture 10 Stationary and Limiting Distributions
    Given a simple graph G, let the random walk on G be the Markov chain defined as follows: 1. The set of states S is the set of vertices V of the graph G. 2 ...
  22. [22]
    [PDF] Continuous-time Markov Chains (CTMC)
    Jan 6, 2012 · The equations (6.1.10) are referred to as the (global) balance equations and state that in equilibrium the total probability flux out of ...
  23. [23]
    [PDF] CONTINUOUS-TIME MARKOV CHAINS - Columbia University
    Dec 4, 2013 · First, we observe that the global-balance equations (flow into state j equals flow out of state j), captured by the equation αQ = 0, can be ...
  24. [24]
    [PDF] Markov chains
    The two theorems above tell us that irreducible finite-state Markov chains have ... These are called the global balance equations for the Markov chain.
  25. [25]
    [PDF] Lecture 4 — Tackling State Space Explosion
    Mar 6, 2013 · usual way by solving the global balance equations. The solution gives you the probability of being in the set of states that have the same ...
  26. [26]
    [PDF] Markov Chains and Markov Chain Monte Carlo
    Detailed balance: the flow of probability mass between each pair of states is balanced: • A Markov chain satisfying detailed balance is called reversible.
  27. [27]
    [PDF] EM509: Individual Project Birth-death Process
    The equations (2) are called detailed balance equations, which hold between every pair of adjacent states, i − 1 and i, of the BD process for all i = 1, 2, 3 ...
  28. [28]
    A Tutorial on the Spectral Theory of Markov Chains - MIT Press Direct
    Oct 10, 2023 · Thus, for a reversible Markov chain in one of its stationary distributions, the flow from one state to another is completely balanced by the ...
  29. [29]
    [PDF] Reversible Markov Chains and Random Walks on Graphs
    This monograph covers reversible Markov chains and random walks on graphs, including word problems, random knight moves, and general Markov chains.
  30. [30]
    [PDF] Networks of queues - University of Bristol
    unless the Markov chain is reversible. Unfortunately, not all Markov chains of interest are reversible! However, there is a class of queueing networks.
  31. [31]
    G-networks by triggered customer movement
    Gelenbe, Erol 1993. G-Networks with Signals and Batch Removal. Probability in the Engineering and Informational Sciences, Vol. 7, Issue. 3, p. 335. CrossRef ...
  32. [32]
    19. Time Reversal in Continuous-Time Chains - Random Services
    First, the Markov property stated in the form that the past and future are independent given the present, essentially treats the past and future symmetrically.
  33. [33]
    [PDF] Chapter 3 Reversible Markov Chains David Aldous Department of ...
    Sep 10, 2002 · A reversible Markov chain has a stationary distribution where πi pij = πj pji for all i, j, meaning the forward and backward movies are ...
  34. [34]
    7. Time Reversal in Discrete-Time Chains - Random Services
    If we have reason to believe that a Markov chain is reversible ... The condition is known as the Kolmogorov cycle condition, and is named for Andrei Kolmogorov.
  35. [35]
    [PDF] Detailed Balance, and Markov Chain Monte Carlo (MCMC) πi
    Detailed balance is an important property of certain Markov Chains that is widely used in physics and statistics. Definition. Let X0, X1, ... be a Markov chain ...
  36. [36]
    Networks of Waiting Lines | Operations Research - PubsOnLine
    Optimal service and arrival rates in Jackson queueing networks. 13 January ... Jackson, (1957) Networks of Waiting Lines. Operations Research 5(4):518 ...
  37. [37]
    [PDF] LECTURE SUMMARY VII - WINLAB, Rutgers University
    Global balance equations: pj Σi Pji = Σi pi Pij, j = 0, 1, 2, ... These equations imply that at equilibrium, the frequency of transitions out of j equals that into j.
  38. [38]
    Mean-field limits for large-scale random-access networks - arXiv
    Nov 29, 2016 · Abstract: We establish mean-field limits for large-scale random-access networks with buffer dynamics and arbitrary interference graphs.