Abstract
Information theory, in the strict sense, is a rapidly developing branch of probability theory originating from a paper by Claude E. Shannon in the Bell System Technical Journal in 1948, in which a new mathematical model of communication systems was proposed and investigated. One of the central innovations of this model was to regard the prime components of a communication system (the source of messages and the communication channel) as probabilistic entities. Shannon also proposed a quantitative measure of the amount of information based on his notion of entropy, and proved the basic theorem of the theory concerning the possibility of reliable transmission of information over a particular class of noisy channels.
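To make the entropy measure mentioned above concrete, here is a minimal sketch in Python of Shannon's entropy for a discrete distribution. The function name `shannon_entropy` and the example distributions are illustrative choices, not taken from the original paper:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) over a discrete
    probability distribution, measured in bits.
    `probs` is an illustrative list of probabilities summing to 1."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: each toss carries 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each toss carries less
# information (about 0.47 bits for a 90/10 split).
print(shannon_entropy([0.9, 0.1]))

# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))        # 0.0
```

In Shannon's framework this quantity bounds how compactly a source's messages can be encoded, which is why a predictable (low-entropy) source needs fewer bits per symbol than an unpredictable one.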
