
Interaction information

Interaction information is a concept in information theory that extends mutual information to quantify the amount of information shared among three or more random variables, capturing dependencies that are not explained by pairwise relationships alone. For three random variables X, Y, and Z, it is formally defined as I(X; Y; Z) = I(X; Y) - I(X; Y \mid Z), where I(X; Y) is the mutual information between X and Y, and I(X; Y \mid Z) is their conditional mutual information given Z; this can equivalently be expressed in terms of entropies as I(X; Y; Z) = H(X) + H(Y) + H(Z) - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z), with H denoting Shannon entropy. Introduced by W. J. McGill in his 1954 paper on multivariate information transmission, the measure differs from mutual information by allowing negative values, which arise when conditioning on the third variable increases the apparent dependence between the first two, signaling synergistic interactions. Note that some literature uses the flipped convention I(X; Y; Z) = I(X; Y \mid Z) - I(X; Y), which reverses the sign interpretation.

Unlike non-negative measures such as mutual information or total correlation, the potential negativity of interaction information provides a unique tool for distinguishing between redundant and synergistic effects in multivariate systems. Positive values indicate redundancy, where the third variable reduces the information shared between the other two, while negative values suggest synergy, where the third variable enhances it. This property has made interaction information particularly valuable in analyzing higher-order dependencies, as it generalizes naturally to n variables through inclusion-exclusion principles, though computation becomes complex for large n due to the need for high-dimensional joint distributions.

In applications, interaction information has been employed in causal inference to resolve ambiguities in directed acyclic graphs, such as identifying the direction of influences in triangular causal structures where standard pairwise analysis fails. For instance, its sign can help differentiate among chain, fork, and collider configurations in observational data, aiding tasks such as detecting side effects of medical treatments. Beyond causality, it features in neuroscience for modeling neural interactions, where negative values highlight cooperative firing patterns among neurons, and in complex systems research to decompose information flows in networks, such as communication or ecological models. Ongoing extensions integrate it with partial information decomposition to handle continuous variables and non-linear dependencies, enhancing its utility in machine learning for feature selection and interpretability.

Fundamentals

Definition

Interaction information is a measure in information theory that extends the concept of mutual information to three or more random variables, capturing the synergistic or redundant dependencies among them that go beyond simple pairwise associations. It quantifies how the joint information content of multiple variables interacts, either enhancing or diminishing the shared information compared to what would be expected from independent pairwise mutual informations. This makes it particularly useful for analyzing complex systems where higher-order interactions play a role. The concept was introduced by William J. McGill in 1954 as part of a broader framework for multivariate information transmission, building on Claude Shannon's foundational work in information theory.

To understand interaction information, it is helpful to recall two prerequisite notions: the Shannon entropy H(X), which measures the uncertainty, or average information content, of a single random variable X, and the mutual information I(X;Y) = H(X) + H(Y) - H(X,Y), which quantifies the information shared between two variables X and Y, representing the reduction in uncertainty about one given knowledge of the other. For three random variables X, Y, and Z, interaction information conceptually represents the amount of information that Z adds to (or removes from) the mutual information between X and Y. In other words, it assesses whether Z introduces additional synergy that increases the overall shared information or redundancy that decreases it relative to the pairwise mutual information I(X;Y). This three-way perspective highlights dependencies that pairwise measures alone cannot detect.
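These prerequisite quantities are straightforward to evaluate for discrete distributions. The following minimal Python sketch (the function names and the example joint table are illustrative assumptions, not taken from the cited literature) computes entropy and pairwise mutual information from a joint probability table:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array (zero entries are ignored)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_information(p_xy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a 2-D joint probability table."""
    p_x = p_xy.sum(axis=1)   # marginal over Y
    p_y = p_xy.sum(axis=0)   # marginal over X
    return entropy(p_x) + entropy(p_y) - entropy(p_xy)

# Example: two correlated binary variables.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
print(round(mutual_information(p_xy), 4))  # ~0.278 bits
```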

Mathematical Formulation

The interaction information for three random variables X, Y, and Z is defined as the mutual information between X and Y minus their conditional mutual information given Z: I(X; Y; Z) = I(X; Y) - I(X; Y \mid Z). This formulation quantifies the change in the information shared between X and Y upon conditioning on Z.

To derive the entropy expansion, start with the definitions of mutual and conditional mutual information in terms of Shannon entropy H(\cdot): I(X; Y) = H(X) + H(Y) - H(X, Y) and I(X; Y \mid Z) = H(X \mid Z) + H(Y \mid Z) - H(X, Y \mid Z). The conditional entropies expand as H(X \mid Z) = H(X, Z) - H(Z), H(Y \mid Z) = H(Y, Z) - H(Z), and H(X, Y \mid Z) = H(X, Y, Z) - H(Z). Substituting these yields I(X; Y \mid Z) = [H(X, Z) - H(Z)] + [H(Y, Z) - H(Z)] - [H(X, Y, Z) - H(Z)] = H(X, Z) + H(Y, Z) - H(X, Y, Z) - H(Z). Thus, I(X; Y; Z) = H(X) + H(Y) - H(X, Y) - [H(X, Z) + H(Y, Z) - H(X, Y, Z) - H(Z)] = H(X) + H(Y) + H(Z) - H(X, Y) - H(X, Z) - H(Y, Z) + H(X, Y, Z). This expansion expresses the interaction information directly in terms of joint and marginal entropies.

The concept generalizes to n random variables V_1, \dots, V_n via the inclusion-exclusion principle applied to their entropies: I(V_1; \dots; V_n) = \sum_{\emptyset \neq T \subseteq \{V_1, \dots, V_n\}} (-1)^{|T| - 1} H\left( \{ V_i \mid i \in T \} \right), where the sum runs over all non-empty subsets T and H(\cdot) denotes the joint entropy of the variables indexed by T. This alternating sum isolates the n-way interaction beyond lower-order dependencies.

The three-variable case defined above also relates to the total correlation (or multivariate mutual information) C(X; Y; Z) = H(X) + H(Y) + H(Z) - H(X, Y, Z), which decomposes as C(X; Y; Z) = I(X; Y) + I(X; Z) + I(Y; Z) - I(X; Y; Z). This shows how the interaction term adjusts the sum of the pairwise mutual informations to yield the overall dependence.
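The entropy expansion translates directly into code. The sketch below is a minimal illustration (function names and the random test distribution are assumptions); it computes the three-variable interaction information from a joint probability table and also implements the general inclusion-exclusion sum over subsets:

```python
import numpy as np
from itertools import combinations

def entropy(p):
    """Shannon entropy in bits of a (possibly multidimensional) probability array."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def joint_entropy(p, axes):
    """Joint entropy of the variables on the given axes, marginalizing out the rest."""
    drop = tuple(ax for ax in range(p.ndim) if ax not in set(axes))
    return entropy(p.sum(axis=drop)) if drop else entropy(p)

def interaction_information(p):
    """Inclusion-exclusion sum over all non-empty subsets of variables (axes of p)."""
    n = p.ndim
    total = 0.0
    for k in range(1, n + 1):
        for subset in combinations(range(n), k):
            total += (-1) ** (k - 1) * joint_entropy(p, subset)
    return total

# Three-variable check against I(X;Y) - I(X;Y|Z) on a random joint distribution.
rng = np.random.default_rng(0)
p = rng.random((2, 2, 2))
p /= p.sum()

I_xy = joint_entropy(p, (0,)) + joint_entropy(p, (1,)) - joint_entropy(p, (0, 1))
I_xy_given_z = (joint_entropy(p, (0, 2)) + joint_entropy(p, (1, 2))
                - joint_entropy(p, (0, 1, 2)) - joint_entropy(p, (2,)))
print(np.isclose(interaction_information(p), I_xy - I_xy_given_z))  # True
```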

Properties

Symmetry and Basic Properties

Interaction information exhibits symmetry under arbitrary permutations of its variables. For three random variables X, Y, and Z, this property is expressed as I(X; Y; Z) = I(Y; Z; X) = I(Z; X; Y), ensuring that the measure remains invariant regardless of the ordering. This symmetry arises from the equivalent formulations of the three-variable expression, as derived in the mathematical formulation section.

When variables are partitioned into subsystems, interaction information demonstrates additivity. Specifically, if the system is divided into groups of variables that are mutually independent, the overall interaction information decomposes into the sum of the interaction informations within each subsystem, with cross-subsystem interactions evaluating to zero. This property reflects the absence of higher-order dependencies across components.

Interaction information relates closely to Markov properties in probabilistic dependencies. In a Markov chain structured as X \to Y \to Z, where X and Z are conditionally independent given Y, the conditional term I(X; Z \mid Y) vanishes, so the interaction information reduces to I(X; Y; Z) = I(X; Z) - I(X; Z \mid Y) = I(X; Z) \geq 0; it equals zero exactly when X and Z are also marginally independent, in which case there is no genuine tripartite interaction beyond the pairwise dependencies.

Unlike mutual information, which is always non-negative, interaction information does not possess this property and can take negative values. Negative interaction information typically signifies synergy among the variables, where the combined information exceeds the sum of individual contributions, while positive values indicate redundancy. This lack of non-negativity distinguishes it as a more nuanced measure of multi-way dependencies.

Computing interaction information for n variables involves evaluating the inclusion-exclusion sum over all 2^n possible subsets, leading to exponential complexity of O(2^n). This arises from the need to sum terms across every combination of variables, making exact calculation infeasible for large n without approximations.
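Both the permutation symmetry and the Markov-chain behavior described above can be checked numerically. The sketch below is a self-contained illustrative assumption, reusing the same entropy-based helpers as the earlier sketch:

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def interaction_information(p):
    """I(X;Y;Z) via the entropy expansion for a 3-D joint probability table."""
    def H(axes):
        drop = tuple(a for a in range(3) if a not in axes)
        return entropy(p.sum(axis=drop)) if drop else entropy(p)
    return (H((0,)) + H((1,)) + H((2,))
            - H((0, 1)) - H((0, 2)) - H((1, 2)) + H((0, 1, 2)))

# Permutation symmetry: permuting the axes leaves the value unchanged.
rng = np.random.default_rng(1)
p = rng.random((2, 3, 2))
p /= p.sum()
vals = [interaction_information(np.transpose(p, perm))
        for perm in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]]
print(np.allclose(vals, vals[0]))  # True

# Markov chain X -> Y -> Z: p(x,y,z) = p(x) p(y|x) p(z|y), so I(X;Z|Y) = 0 and the
# interaction information reduces to the end-to-end mutual information I(X;Z) >= 0.
p_x = np.array([0.3, 0.7])
p_y_given_x = np.array([[0.9, 0.1], [0.2, 0.8]])
p_z_given_y = np.array([[0.6, 0.4], [0.1, 0.9]])
p_chain = np.einsum('x,xy,yz->xyz', p_x, p_y_given_x, p_z_given_y)
I_xz = (entropy(p_chain.sum(axis=(1, 2))) + entropy(p_chain.sum(axis=(0, 1)))
        - entropy(p_chain.sum(axis=1)))
print(np.isclose(interaction_information(p_chain), I_xz))  # True
```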

Bounds and Inequalities

Interaction information satisfies several fundamental bounds and inequalities that constrain its possible values and behavior under transformations of the variables. The upper bound for the three-variable interaction information is given by the minimum of the pairwise mutual informations: I(X;Y;Z) \leq \min \left\{ I(X;Y), I(X;Z), I(Y;Z) \right\}. This follows directly from the non-negativity of conditional mutual information, as I(X;Y;Z) = I(X;Y) - I(X;Y|Z) and I(X;Y|Z) \geq 0, with the symmetric expressions yielding the other pairwise terms.

A corresponding lower bound arises from the alternative expressions for interaction information, which reveal its potential negativity: I(X;Y;Z) \geq -\min \left\{ I(X;Y|Z), I(X;Z|Y), I(Y;Z|X) \right\}. Here, the bound exploits the fact that I(X;Y;Z) = I(X;Y) - I(X;Y|Z) \geq -I(X;Y|Z) (since I(X;Y) \geq 0), and similarly for the other two conditional terms; combining the three gives the tightest bound, determined by the smallest conditional mutual information. This inequality highlights how interaction information becomes negative exactly when conditioning on one variable increases the information shared between the other two.

The data processing inequality constrains the components of interaction information under local processing of a single variable. If X' = f(X) for some function f, then I(X';Y) \leq I(X;Y) and I(X';Y|Z) \leq I(X;Y|Z), with equality if f is invertible; because interaction information is the difference of these two terms, it is not in general monotone under such processing, but the pairwise and conditional bounds above continue to hold with X' in place of X.

Interaction information also relates to conditional forms when an additional independent variable is introduced. If W is independent of (X, Y, Z), then I(X;Y;Z|W) = I(X;Y;Z). This follows because independence implies that all relevant conditional entropies equal the unconditional ones, preserving the interaction information value.

For higher-order interaction information involving more than three variables, general bounds follow a similar structure, scaling with the minimum over relevant pairwise or lower-order mutual informations. The upper bound is typically I(X_1;\dots;X_n) \leq \min_{i<j} I(X_i;X_j), reflecting the constraint from non-negativity at lower orders, while the lower bound extends analogously to negative values bounded by conditional terms. These generalizations preserve the core inequalities from the three-variable case but grow more complex with dimensionality.
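The three-variable bounds can be verified empirically on randomly drawn joint distributions; the following sketch (function names and the sampling scheme are assumptions) checks both inequalities:

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def H(p, axes):
    drop = tuple(a for a in range(p.ndim) if a not in axes)
    return entropy(p.sum(axis=drop)) if drop else entropy(p)

def mi(p, a, b):
    return H(p, (a,)) + H(p, (b,)) - H(p, (a, b))

def cmi(p, a, b, c):
    return H(p, (a, c)) + H(p, (b, c)) - H(p, (a, b, c)) - H(p, (c,))

rng = np.random.default_rng(42)
for _ in range(1000):
    p = rng.random((2, 2, 2))
    p /= p.sum()
    ii = mi(p, 0, 1) - cmi(p, 0, 1, 2)                       # I(X;Y;Z)
    upper = min(mi(p, 0, 1), mi(p, 0, 2), mi(p, 1, 2))
    lower = -min(cmi(p, 0, 1, 2), cmi(p, 0, 2, 1), cmi(p, 1, 2, 0))
    assert lower - 1e-9 <= ii <= upper + 1e-9
print("bounds hold on all sampled distributions")
```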

Examples and Interpretation

Positive Interaction Information

Positive interaction information arises when the three-variable interaction measure exceeds zero, signifying that the third variable Z contributes redundant information to the mutual dependence between X and Y, such that conditioning on Z reduces or explains away their apparent dependence. This redundancy reflects situations where the joint distribution of X, Y, and Z reveals that Z accounts for the correlation observed between X and Y, often indicating that observing Z makes X and Y conditionally independent. In the context of the three-variable formula, a positive value highlights this explanatory role of Z in reducing the shared information structure among the variables.

A classic illustration of positive interaction information is a scenario with weather variables: let X denote rain occurrence, Y denote sky darkness, and Z denote cloud presence. Clouds (Z) often cause both rain (X) and darkness (Y), leading to a high unconditional mutual information I(X;Y) due to their common cause. However, conditioning on clouds renders rain and darkness conditionally independent, with I(X;Y|Z) low or zero. The resulting positive interaction information I(X;Y;Z) > 0 captures the redundant dependency, where clouds provide the unifying explanation for the observed correlation between rain and darkness.

To illustrate the computation, consider a simple joint distribution for the weather analogy with binary variables (0 for absence/light conditions, 1 for presence/darkness/rain). Assume P(Z=1) = 0.5 (clouds present half the time). When Z=1, X=1 with probability 0.8 (rain likely) and Y=1 with probability 0.8 (dark likely), independently of each other. When Z=0, both X=0 and Y=0 with probability 1 (clear conditions, no rain or darkness). The marginal probabilities are P(X=1) = 0.5 \times 0.8 = 0.4 and similarly P(Y=1) = 0.4. With the binary entropy function h(p) = -p \log_2 p - (1-p) \log_2 (1-p), the marginal entropies are H(X) = H(Y) = h(0.4) \approx 0.971 bits. The joint distribution of (X, Y) has P(0,0) = 0.5 + 0.5 \times 0.2 \times 0.2 = 0.52, P(0,1) = P(1,0) = 0.5 \times 0.2 \times 0.8 = 0.08, and P(1,1) = 0.5 \times 0.8 \times 0.8 = 0.32, giving H(X,Y) \approx 1.600 bits. The unconditional mutual information is therefore I(X;Y) = h(0.4) + h(0.4) - H(X,Y) \approx 1.942 - 1.600 = 0.342 \text{ bits}. The conditional mutual information is I(X;Y|Z) = \sum_z P(Z=z) I(X;Y|Z=z). For Z=1, I(X;Y|Z=1)=0 (X and Y are independent given clouds); for Z=0, I(X;Y|Z=0)=0 (both variables are deterministic). So I(X;Y|Z) = 0. The interaction information is I(X;Y;Z) = I(X;Y) - I(X;Y|Z) \approx 0.342 - 0 = 0.342 > 0 \text{ bits}, confirming positive interaction information due to the redundant explanatory role of Z.
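A short script can reproduce these numbers; the sketch below (variable layout and names are illustrative assumptions) builds the joint table for the weather analogy and evaluates each quantity:

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Joint table p[x, y, z] for the weather analogy:
# Z ~ Bernoulli(0.5); given Z = 1, X and Y are independent Bernoulli(0.8);
# given Z = 0, X = Y = 0 deterministically.
p = np.zeros((2, 2, 2))
for x in range(2):
    for y in range(2):
        p[x, y, 1] = 0.5 * (0.8 if x else 0.2) * (0.8 if y else 0.2)
p[0, 0, 0] = 0.5

def H(axes):
    drop = tuple(a for a in range(3) if a not in axes)
    return entropy(p.sum(axis=drop)) if drop else entropy(p)

I_xy = H((0,)) + H((1,)) - H((0, 1))
I_xy_given_z = H((0, 2)) + H((1, 2)) - H((0, 1, 2)) - H((2,))
print(round(I_xy, 3), round(I_xy_given_z, 3), round(I_xy - I_xy_given_z, 3))
# ~0.342 bits, 0.0 bits, 0.342 bits
```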

Negative Interaction Information

Negative interaction information arises when the interaction information I(X; Y; Z) takes a negative value, which occurs if the conditional mutual information I(X; Y \mid Z) is greater than the unconditional mutual information I(X; Y). This situation reflects synergy, where knowledge of Z enhances the shared information between X and Y, as Z reveals or amplifies their dependence. In such distributions, the joint dependency structure shows that X and Y are more informative about each other when conditioned on Z than marginally, indicating complementary or synergistic roles among the variables. For instance, consider a probability distribution where X and Y are independent without Z (so I(X; Y) = 0), but become dependent given Z = 1, with I(X; Y \mid Z=1) > 0; the resulting I(X; Y; Z) = I(X; Y) - I(X; Y \mid Z) < 0 quantifies this synergy.

A classic illustration of negative interaction information is the XOR logic gate, where the output Z depends synergistically on the binary inputs X and Y such that Z = 1 if and only if exactly one of X or Y is 1. Assuming X and Y are independent and uniformly distributed (each taking value 1 with probability 0.5), the mutual information is I(X;Y) = 0 bits. However, conditioned on Z, X and Y become dependent: given Z=0, X=Y; given Z=1, X \neq Y. The conditional mutual information is I(X;Y|Z) = 1 bit, yielding I(X;Y;Z) = 0 - 1 = -1 bit < 0. This negative value underscores the synergy, as neither X nor Y alone predicts Z, but their combination does.

Another classic example is the car failure scenario. Here, let X represent a dead battery, Y a blocked fuel pump, and Z the event that the car fails to start. Unconditionally, battery failure and fuel pump blockage are independent causes, yielding I(X; Y) = 0. However, conditioned on the car not starting (Z = 1), X and Y exhibit dependence: if the battery is functional, the fuel pump must be blocked (and vice versa), so I(X; Y \mid Z=1) > 0. This leads to negative I(X; Y; Z), capturing the synergistic explanatory power of the common effect Z.

In biological systems, by contrast, positive interaction information highlights redundancy, such as in neural processing where multiple brain regions provide overlapping information about a stimulus. For example, in auditory tasks, left posterior cortical regions can exhibit positive interaction information with other areas, indicating redundant contributions to auditory processing that enhance robustness without adding unique synergistic effects. An analogous scenario arises in pharmacology, where two drugs X and Y with overlapping mechanisms increase the joint information about a therapeutic outcome Z in a redundant manner, as knowledge of one drug's effect largely duplicates the other.
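The XOR example is small enough to compute exactly; this minimal sketch (names are assumptions) constructs the XOR joint distribution and recovers the value of -1 bit:

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# p[x, y, z]: X and Y are uniform independent bits, Z = X XOR Y (deterministic).
p = np.zeros((2, 2, 2))
for x in range(2):
    for y in range(2):
        p[x, y, x ^ y] = 0.25

def H(axes):
    drop = tuple(a for a in range(3) if a not in axes)
    return entropy(p.sum(axis=drop)) if drop else entropy(p)

I_xy = H((0,)) + H((1,)) - H((0, 1))                            # 0 bits
I_xy_given_z = H((0, 2)) + H((1, 2)) - H((0, 1, 2)) - H((2,))   # 1 bit
print(I_xy - I_xy_given_z)  # -1.0
```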

Challenges

Interpretation of Negative Values

Negative values of interaction information present a conceptual challenge because, unlike mutual information, which is always non-negative, interaction information can be negative, signaling synergy among variables while indicating strong three-way dependencies that confound simple pairwise interpretations. This negativity arises when the conditional mutual information between two variables exceeds their unconditional mutual information, suggesting that knowledge of the third variable enhances rather than diminishes the shared information between the pair. Such outcomes highlight the measure's sensitivity to complex multivariate structures, where negative values do not imply absence of dependence but rather a nuanced interplay that traditional non-negative measures like mutual information cannot capture.

In the space of probability distributions, negative interaction information corresponds geometrically to scenarios where conditioning on the third variable Z enhances the correlation between X and Y, increasing their shared information by revealing complementary dependencies or synergistic pathways. This interpretation aligns with probabilistic models in which Z acts in a way that amplifies pairwise associations conditionally, mapping to regions of the probability simplex where the three-way joint distribution exhibits facilitative effects on pairwise associations.

A common misconception is that negative interaction information denotes "anti-synergy" or oppositional effects among variables; instead, it reflects situations where synergy outweighs redundancy in partial information decompositions, since the negativity can confound pure redundancy with synergistic contributions. This distinction is crucial and requires careful decomposition to disentangle the components.

Implications of Zero Interaction

Zero interaction information, denoted I(X;Y;Z) = 0, indicates the absence of a three-way interaction among the random variables X, Y, and Z. This condition holds when the mutual information between any pair of variables equals the corresponding conditional mutual information given the third variable, meaning that observing the third variable provides no additional insight into the dependency between the pair. For instance, I(X;Y) = I(X;Y \mid Z), implying that the information shared between X and Y remains unaltered by knowledge of Z.

A key implication of zero interaction information is that it does not necessarily imply conditional independence between any pair of variables given the third. Conditional independence X \perp Y \mid Z requires I(X;Y \mid Z) = 0, but zero interaction allows for I(X;Y \mid Z) > 0 as long as it equals I(X;Y) > 0, preserving pairwise dependencies unaffected by the conditioner. Consider a scenario where X and Y exhibit mutual information due to a direct relationship, while Z is independent of both; here, I(X;Y;Z) = 0 despite the evident dependency between X and Y, and no conditional independence holds given Z. This highlights a limitation: zero interaction merely signals balanced or absent three-way effects, not the elimination of all dependencies.

Furthermore, zero interaction information does not preclude the presence of higher-order dependencies among the variables. In structures involving more than three variables, pairwise and three-way interactions may cancel or be absent while four-way or higher correlations persist, masking complex interdependencies. A recent study in quantum information theory reinforces this pitfall, showing that zero values in three-party measures like interaction information can obscure genuine higher-order quantum correlations in entangled states, where low-order balances fail to capture multipartite entanglement effects.
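The scenario described above, with X and Y dependent but Z independent of both, yields exactly zero interaction information, as the following illustrative sketch confirms:

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# X and Y correlated; Z an independent fair coin: p(x,y,z) = p(x,y) * p(z).
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p = np.einsum('xy,z->xyz', p_xy, np.array([0.5, 0.5]))

def H(axes):
    drop = tuple(a for a in range(3) if a not in axes)
    return entropy(p.sum(axis=drop)) if drop else entropy(p)

I_xy = H((0,)) + H((1,)) - H((0, 1))
I_xy_given_z = H((0, 2)) + H((1, 2)) - H((0, 1, 2)) - H((2,))
print(round(I_xy, 3), round(I_xy_given_z, 3), round(I_xy - I_xy_given_z, 3))
# I(X;Y) ~ 0.278 bits, I(X;Y|Z) ~ 0.278 bits, interaction information = 0.0
```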

Applications

Machine Learning and Feature Selection

Interaction information plays a key role in feature selection within machine learning by quantifying three-way dependencies among attributes, enabling the identification of synergistic or redundant features that pairwise mutual information might overlook. Introduced into this setting by Jakulin and Bratko in 2003, the measure, defined as I(X; Y; Z) = I(X; Y) - I(X; Y | Z), helps detect interactions where features provide complementary information (negative values indicating synergy) or overlap excessively (positive values indicating redundancy). In practice, it supports filter-based methods that rank feature subsets, improving model performance on datasets with complex attribute relationships, such as the MONK1 dataset, where it highlights critical three-way interactions.

Extensions of this approach have integrated interaction information into multi-label settings, where it evaluates label-feature dependencies to select informative features while mitigating redundancy. For instance, a 2022 review highlights its use in greedy algorithms that iteratively add features based on interaction gains, outperforming univariate selectors on high-dimensional data. Another 2022 method employs conditional variants of interaction information for multi-label classification, demonstrating reduced computational overhead compared to exhaustive searches.

In algorithmic applications, interaction information aids decision tree construction and wrapper methods by flagging collinear or conditionally independent features, as negative values signal synergy akin to that discussed under interpretive challenges. This enhances tree interpretability and generalization; for example, in attribute interaction dendrograms, features with zero or low interaction scores are deprioritized, reducing overfitting on benchmark datasets such as those in the UCI repository. A typical workflow involves computing I(X; Y; Z) for candidate feature pairs together with the class label, ranking the candidates by absolute interaction strength, and selecting top subsets for training classifiers like random forests. This process, applied to text categorization tasks, has shown gains in F1-score over baseline filters by capturing higher-order synergies.

Recent developments incorporate interaction information into deep learning frameworks to build interaction-aware neural networks, particularly for feature selection in bioinformatics. A 2024 bioRxiv preprint extends it via visible neural networks to detect genetic interactions, enabling scalable feature ranking in genomic datasets with non-linear dependencies. This integration allows models to weigh three-way terms during training, improving prediction in multi-omics tasks without explicit enumeration of all interactions.
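As a concrete illustration of such a filter, the sketch below scores each candidate feature by its mutual information with the class label, adjusted by its interaction information with already-selected features, so that synergy (negative values) raises the score and redundancy (positive values) lowers it. This is a simplified, hypothetical criterion on synthetic data, not the exact Jakulin–Bratko procedure or any specific published algorithm:

```python
import numpy as np

def entropy_from_counts(counts):
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def joint_counts(*cols):
    """Contingency table over the given discrete (integer-coded) columns."""
    table = np.zeros([c.max() + 1 for c in cols])
    for row in zip(*cols):
        table[row] += 1
    return table

def mi(a, b):
    H = lambda *cols: entropy_from_counts(joint_counts(*cols))
    return H(a) + H(b) - H(a, b)

def interaction(a, b, y):
    """Plug-in estimate of I(a; b; y) via the entropy expansion."""
    H = lambda *cols: entropy_from_counts(joint_counts(*cols))
    return H(a) + H(b) + H(y) - H(a, b) - H(a, y) - H(b, y) + H(a, b, y)

def score(j, selected, X, y):
    """Relevance of feature j, boosted by synergy (negative interaction)
    and penalized by redundancy (positive interaction) with selected features."""
    s = mi(X[:, j], y)
    s -= sum(interaction(X[:, j], X[:, k], y) for k in selected)
    return s

# Synthetic data: the label is f0 XOR f1 (a synergistic pair);
# f2 is a redundant copy of f0 and f3 is pure noise.
rng = np.random.default_rng(0)
n = 5000
f0, f1, f3 = rng.integers(0, 2, (3, n))
f2 = f0.copy()
y = f0 ^ f1
X = np.column_stack([f0, f1, f2, f3])

# With feature 0 already selected, score the remaining candidates.
for j in (1, 2, 3):
    print(j, round(score(j, [0], X, y), 3))
# Feature 1 (the XOR partner) scores about +1 bit because of its strongly negative
# (synergistic) interaction with feature 0; the redundant copy (2) and the noise
# feature (3) score near zero. A greedy selector would repeatedly add the argmax.
```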

Biological and Causal Systems

In statistical genetics, interaction information serves as a tool for detecting gene-environment interactions that contribute to complex phenotypes, particularly quantitative traits. Early work highlighted the importance of such interactions in disease susceptibility, with methods like the genetic interaction tree proposed to identify non-linear effects in nuclear families. This approach was extended using information-theoretic measures, such as the k-way interaction information (KWII), to quantify associations between genetic variants, environmental factors, and quantitative traits. For instance, the AMBIENCE algorithm applies KWII to genome-wide data, prioritizing informative gene-environment pairs by measuring the information shared beyond pairwise mutual informations and demonstrating higher power in simulations compared to traditional models. Recent studies on quantitative traits, including those leveraging entropy-based metrics, have further refined these techniques to handle high-dimensional genotype data and multiple environmental exposures, revealing interactions that explain additional trait variance in large population cohorts.

In causal inference, interaction information provides a basis for testing three-way causal relationships, particularly in triangular structures. A key application involves assessing whether three variables form a causal triangle, where positive interaction information indicates redundancy consistent with certain causal directions, while negative values suggest synergy incompatible with direct causation. This method, applied to simulated and real datasets, outperforms purely pairwise analyses by capturing multi-variable dependencies without assuming acyclicity. In systems biology, differences between conditional and unconditional mutual informations—equivalent to interaction information—enable the identification of synergistic or redundant effects in biological networks, such as signaling cascades in cancer. For example, in gene regulatory pathways, non-zero interaction information between a regulator, an environmental cue, and a downstream effector highlights causal interplay, aiding the prioritization of therapeutic targets in 2022 analyses of proteomic data.

Molecular simulations employ interaction information to quantify protein-ligand interactions and allosteric mechanisms. The N-body Information Theory (NbIT) framework uses three-way interaction information to detect coordinated residue motions induced by ligand binding, revealing functional hotspots in transporters like the leucine transporter LeuT. In simulations of G-protein-coupled receptors, negative interaction information identifies synergistic dynamics suppressed by ligands, while positive values pinpoint redundant allosteric sites. Recent entropy-based reviews emphasize these applications in dissecting binding affinities and conformational changes, integrating interaction information with molecular dynamics trajectories to model entropy flows in protein-ligand complexes without relying on predefined pathways.

In epidemiology, the interaction information I(\text{gene}; \text{environment}; \text{disease}) identifies interaction-driven risks, such as when a genetic variant and an environmental exposure jointly elevate disease odds beyond their separate effects. For example, in studies of folate metabolism genes and related exposures, the sign and magnitude of the interaction information can guide risk stratification in population cohorts.
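In practice these analyses start from observational samples rather than known distributions, so the three-way term is typically estimated with a plug-in estimator over a contingency table. The sketch below is a simplified illustration with assumed names and simulated genotype, exposure, and trait data, not the AMBIENCE algorithm or any specific published pipeline:

```python
import numpy as np

def plugin_entropy(counts):
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def kwii3(g, e, d):
    """Plug-in estimate of I(G; E; D) = I(G; E) - I(G; E | D) from discrete samples."""
    def H(*cols):
        table = np.zeros([c.max() + 1 for c in cols])
        for row in zip(*cols):
            table[row] += 1
        return plugin_entropy(table)
    return H(g) + H(e) + H(d) - H(g, e) - H(g, d) - H(e, d) + H(g, e, d)

# Simulated data: disease risk is elevated only when the risk genotype and the
# exposure co-occur (a synergistic, interaction-driven effect).
rng = np.random.default_rng(7)
n = 20000
genotype = rng.integers(0, 3, n)        # 0/1/2 copies of a risk allele
exposure = rng.integers(0, 2, n)        # environmental exposure yes/no
risk = np.where((genotype == 2) & (exposure == 1), 0.5, 0.05)
disease = (rng.random(n) < risk).astype(int)

print(round(kwii3(genotype, exposure, disease), 4))
# Negative under this article's sign convention, indicating synergy between
# genotype and exposure with respect to disease status.
```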

Quantum and Higher-Order Extensions

Interaction information has been generalized to quantum systems by replacing Shannon entropy with von Neumann entropy, enabling the quantification of multipartite correlations in quantum states. This extension defines a family of quantum mutual information measures that capture interdependencies among multiple quantum subsystems, generalizing conditional mutual information and interaction information to higher numbers of parties. The approach provides insights into classical, quantum, and total correlations, with properties such as monotonicity under local operations, and addresses open questions in quantum information theory.

For systems involving more than three variables (n > 3), interaction information extends through k-way measures defined recursively via Möbius inversion on the lattice of subsets, allowing the decomposition of multivariate dependence into partial association components. These partial association measures, adjusted for higher-order redundancies, facilitate the detection of synergistic or redundant dependencies in complex datasets. In applications to complex systems, such as genomic variable selection, these extensions enable the identification of higher-order interactions that pairwise measures overlook, improving model performance in high-dimensional settings like gene-gene interaction analysis.

Recent developments include applications in astrophysics, where interaction information quantifies the influence of large-scale cosmic environments on galaxy properties, revealing synergies or redundancies between the local environment and observables such as color; this approach, initially proposed in 2017, has been extended in subsequent analyses. Additionally, the InfoTopo software package implements topological information analysis for estimating higher-order dependencies as simplicial complexes, supporting entropy-based decompositions of multivariate data; updates through 2024 enhance its scalability for multiparty interactions in neural and ecological data.

A key challenge in these extensions is the computational explosion for n > 4, where exact computation of higher-order terms requires evaluating 2^n entropies, leading to exponential scaling that renders full decompositions infeasible for large systems. Approximations via sampling methods, such as permutation-free testing or neural estimators, mitigate this by generating empirical distributions for interaction terms, though they introduce biases in sparse regimes.
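To make the von Neumann substitution concrete, the following sketch (an illustrative assumption, not code from the cited works) computes the three-party quantum interaction information of a GHZ state by replacing each Shannon entropy in the inclusion-exclusion formula with the von Neumann entropy of the corresponding reduced density matrix:

```python
import numpy as np
from itertools import combinations

def von_neumann_entropy(rho):
    """Von Neumann entropy in bits (base-2 logarithm)."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def reduced_density_matrix(rho, keep, n_qubits):
    """Partial trace of an n-qubit density matrix, keeping the qubits in `keep`."""
    dims = [2] * n_qubits
    rho = rho.reshape(dims + dims)
    # Trace out unwanted qubits in decreasing order so axis indices stay valid.
    for q in sorted(set(range(n_qubits)) - set(keep), reverse=True):
        rho = np.trace(rho, axis1=q, axis2=q + rho.ndim // 2)
    d = 2 ** len(keep)
    return rho.reshape(d, d)

def quantum_interaction_information(rho, n_qubits):
    """Inclusion-exclusion over all non-empty subsets, with von Neumann entropies."""
    total = 0.0
    for k in range(1, n_qubits + 1):
        for subset in combinations(range(n_qubits), k):
            total += (-1) ** (k - 1) * von_neumann_entropy(
                reduced_density_matrix(rho, subset, n_qubits))
    return total

# Three-qubit GHZ state (|000> + |111>) / sqrt(2).
ghz = np.zeros(8)
ghz[0] = ghz[7] = 1 / np.sqrt(2)
rho = np.outer(ghz, ghz.conj())
print(round(quantum_interaction_information(rho, 3), 6))
# 0.0: every single- and two-qubit entropy is 1 bit while the global state is pure,
# so redundancy and synergy balance exactly for the GHZ state.
```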
