Recent results in information theory
- 1 June 1966
- journal article
- Published by Cambridge University Press (CUP) in Journal of Applied Probability
- Vol. 3 (1), 1-93
- https://doi.org/10.2307/3212039
Abstract
Information theory, in the strict sense, is a rapidly developing branch of probability theory originating from a paper by Claude E. Shannon in the Bell System Technical Journal in 1948, in which a new mathematical model of communications systems was proposed and investigated. One of the central innovations of this model was in regarding the prime components of a communications system (the source of messages and the communication channel) as probabilistic entities. Shannon also proposed a quantitative measure of the amount of information based on his notion of entropy and proved the basic theorem of this theory concerning the possibility of reliable transmission of information over a particular class of noisy channels.
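The quantitative measure of information mentioned in the abstract is Shannon's entropy. As a point of reference (the definition below is the standard one from Shannon's 1948 paper, not a formula reproduced from this survey), for a discrete source emitting symbols with probabilities \(p_1, \dots, p_n\) it is:

\[
H = -\sum_{i=1}^{n} p_i \log_2 p_i ,
\]

measured in bits when the logarithm is taken to base 2. Shannon's basic theorem then states, roughly, that a source of entropy rate \(H\) can be transmitted reliably over a noisy channel of capacity \(C\) whenever \(H < C\).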