References
- [1] C. E. Shannon, "A Mathematical Theory of Communication," Bell System Technical Journal, 1948. Extends the theory to include the effect of noise in the channel.
- [2] "Claude E. Shannon," IEEE Information Theory Society. Profile of the American mathematician who conceived and laid the foundations of information theory.
- [3] "Introduction to Information Theory," Stanford University. Introduces the basic concepts of information theory, along with the probability definitions and notation used throughout.
- [4] "Claude E. Shannon: Founder of Information Theory," Oct. 14, 2002. In a landmark 1948 paper written at Bell Labs, Shannon defined in mathematical terms what information is and how it can be transmitted in the face of noise.
- [5] "Information Theory: Bits and Binary Digits." The simplest of the many definitions of information in Shannon's theory: information is a decrease in uncertainty.
- [6] "Information Theory in Computational Biology: Where We Stand Today." Reviews basic information-theoretic concepts and their key applications across major areas of computational biology.
- [7] "Claude Shannon: Biologist," NIH. Shannon founded information theory in the 1940s; the theory has long been known to be closely related to thermodynamics and physics.
- [8] H. Nyquist, "Certain Factors Affecting Telegraph Speed," Bell System Technical Journal, 1924. Considers two fundamental factors entering into the maximum speed of transmission of intelligence by telegraph.
- [9] R. V. L. Hartley, "Transmission of Information," Bell System Technical Journal, 1928. Discusses how the rate of transmission of information over a system is limited by the distortion resulting from storage of energy.
- [10] C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, University of Illinois Press, 1949. Weaver's exposition of the mathematical theory of the engineering aspects of communication developed chiefly by Shannon at the Bell Telephone Laboratories.
- [11] R. J. Solomonoff, "A Preliminary Report on a General Theory of Inductive Inference," 1960. Preliminary work on a very general new theory of inductive inference.
- [12] A. N. Kolmogorov, "Three Approaches to the Quantitative Definition of Information," 1965. Surveys the two common approaches to quantitatively defining information, combinatorial and probabilistic, and introduces a third, algorithmic approach.
- [13] A. S. Holevo, "Bounds for the Quantity of Information Transmitted by a Quantum Communication Channel," 1973. Derives bounds on the quantity of information transmitted by a quantum channel.
- [14] M. A. Nielsen and I. L. Chuang, Quantum Computation and Quantum Information, Cambridge University Press. Comprehensive textbook covering fast quantum algorithms, quantum teleportation, quantum cryptography, and quantum error correction.
- [15] N. Tishby, F. C. Pereira, and W. Bialek, "The Information Bottleneck Method," Proceedings of the 37th Allerton Conference on Communication, Control and Computing, Urbana, Illinois, 1999.
- [16] arXiv:2407.12288 [stat.ML], May 21, 2025. A theoretical framework rooted in Bayesian statistics and Shannon's information theory.
- [17] T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley.
- [18] T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley (Semantic Scholar record; see [17]).
- [19] S. Kullback and R. A. Leibler, "On Information and Sufficiency," The Annals of Mathematical Statistics, 1951.
- [20] H. Akaike, "Information Theory and an Extension of the Maximum Likelihood Principle," 1973. Introduces the Akaike Information Criterion (AIC) as a solution to the problem of selecting variables for a multiple regression model.
- [21] D. A. Huffman, "A Method for the Construction of Minimum-Redundancy Codes," Proceedings of the IRE, 1952.
- [22] I. H. Witten, R. M. Neal, and J. G. Cleary, "Arithmetic Coding for Data Compression," Communications of the ACM, 1987.
- [23] C. E. Shannon, "Prediction and Entropy of Printed English," Bell System Technical Journal, 1951. Entropy measures information per letter, while redundancy measures constraints; prediction is based on how well the next letter can be predicted from the preceding text.
- [24] R. M. Fano, "The Transmission of Information." The main purpose is to provide a logical basis for measuring the rate of transmission of information.
- [25] T. A. Welch, "A Technique for High-Performance Data Compression," IEEE Computer, 1984.
- [26] J. Ziv and A. Lempel, "Compression of Individual Sequences via Variable-Rate Coding," IEEE Transactions on Information Theory, 1978.
- [27] J. G. Cleary and I. H. Witten, "Data Compression Using Adaptive Coding and Partial String Matching," IEEE Transactions on Communications, 1984. Shows how the modeling conflict can be resolved with partial string matching, with experimental results on mixed-case English text.
- [28] G. K. Wallace, "The JPEG Still Picture Compression Standard," 1991. Quantization achieves further compression by representing DCT coefficients with no greater precision than necessary.
- [29] Lecture notes, "Shannon's Noisy Coding Theorem," Section 16.1, "Defining a Channel." The goal is to understand which encodings let the decoded message match the transmitted message with high probability while keeping the block length n as small as possible.
- [30] "Reed-Muller Codes Achieve Capacity on Erasure Channels," arXiv, June 15, 2015. A new approach to proving that a sequence of deterministic linear codes achieves capacity on an erasure channel.
- [31] "Codes for the Z-Channel," arXiv, July 2, 2023. Includes a discussion of the capacity of Z-channels with stochastic errors, a direct consequence of the seminal channel coding theorem.
- [32] E. Telatar, "Capacity of Multi-antenna Gaussian Channels." Investigates the use of multiple transmit and/or receive antennas for single-user communication over the additive Gaussian channel, with and without fading.
- [33] N. J. A. Sloane, "On Single-Deletion-Correcting Codes," arXiv:math/0207197, July 22, 2002. A brief survey of binary single-deletion-correcting codes; the Varshamov-Tenengolts codes appear to be optimal, but many interesting problems remain unsolved.
- [34] "Information Theory in Modem Practice," EPFL.
- [35] ETSI TR 138 901 V14.3.0, "5G; Study on channel model," corresponding to 3GPP TR 38.901.
- [36] J. L. Massey, "Causality, Feedback and Directed Information," 1990. Gives a definition, closely based on an old idea of Marko, of the directed information flowing from one sequence to another.
- [37]
- [38] "Feedback Capacity of Stationary Gaussian Channels," arXiv. Shows that the celebrated Schalkwijk-Kailath coding scheme achieves the feedback capacity in the first-order autoregressive case.
- [39] "Directed Information for Channels with Feedback." Investigates the capacity regions of channels with feedback; the corresponding information rates are simplified using conditional independence.
- [40]
- [41] C. E. Shannon, "Communication Theory of Secrecy Systems," Bell System Technical Journal, 1949. Develops a theory of secrecy systems at a theoretical level, intended to complement the treatment found in standard works on cryptography.
- [42] A. D. Wyner, "The Wire-Tap Channel," Bell System Technical Journal, 1975.
- [43] U. M. Maurer, "Secret Key Agreement by Public Discussion from Common Information," IEEE Transactions on Information Theory, 1993.
- [44] "How Much Randomness Can Be Extracted from Memoryless Shannon Entropy Sources." Examines the heuristic estimate, obtained from the Asymptotic Equipartition Property, of the number of extractable bits.
- [45] C. Cachin, "An Information-Theoretic Model for Steganography," 2004. Proposes an information-theoretic model for steganography with a passive adversary, whose task is to distinguish cover data from stego data.
- [46] C. H. Bennett and G. Brassard, "Quantum Cryptography: Public Key Distribution and Coin Tossing," 1984. When elementary quantum systems, such as polarized photons, are used to transmit digital information, the uncertainty principle gives rise to novel cryptographic phenomena.
- [47] "Using Information Theory Approach to Randomness Testing," arXiv, April 3, 2005. Applies known and newly suggested statistical tests to PRNGs; experiments show the new tests are more powerful.
- [48] N. Tishby, F. C. Pereira, and W. Bialek, "The Information Bottleneck Method," arXiv:physics/0004057, April 24, 2000. Squeezes the information that X provides about Y through a bottleneck formed by a limited set of codewords.
- [49] N. Tishby and N. Zaslavsky, "Deep Learning and the Information Bottleneck Principle," arXiv, March 9, 2015. Analyzes deep neural networks via the theoretical framework of the information bottleneck principle.
- [50] "InfoGAN: Interpretable Representation Learning by Information Maximizing Generative Adversarial Nets," arXiv:1606.03657, June 12, 2016. An information-theoretic extension of the GAN that learns disentangled representations.
- [51] "Rate-Distortion Auto-Encoders," arXiv:1312.7381, December 28, 2013. A learning algorithm for auto-encoders based on a rate-distortion objective that minimizes the mutual information between the inputs and the outputs.
- [52] "How Many Clusters? An Information-Theoretic Perspective." In a statistical-mechanics approach, clustering can be seen as a trade-off between energy- and entropy-like terms.
- [53] F. Fleuret, "Fast Binary Feature Selection with Conditional Mutual Information," Journal of Machine Learning Research, 2004. A very fast feature-selection technique that picks features maximizing conditional mutual information.
- [54] "Information-Theoretic Diffusion," arXiv:2302.03792, February 7, 2023. A new mathematical foundation for diffusion models inspired by classic information-theoretic results connecting information with minimum mean square error estimation.