Communication theory is a field of information theory and mathematics that studies the technical process of information and the process of human communication. According to the communication theorist Robert T. Craig in his essay 'Communication Theory as a Field' (1999), "despite the ancient roots and growing profusion of theories about communication," there is not a field of study that can be identified as 'communication theory'.
Origins
"The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point." (Claude Shannon, 1916-2001)
The origins of communication theory are linked to the development of information theory in the early 1920s. Limited information-theoretic ideas had been developed at Bell Labs, all implicitly assuming events of equal probability.
Harry Nyquist's 1924 paper, 'Certain Factors Affecting Telegraph Speed', contains a theoretical section quantifying "intelligence" and the "line speed" at which it can be transmitted by a communication system.
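In modern notation, the relation Nyquist derived there can be sketched as

W = K \log m

where W is the speed at which "intelligence" can be transmitted, m is the number of distinct current values (signal levels) the line can carry, and K is a constant. The notation here is a modern paraphrase rather than Nyquist's own.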
Ralph Hartley's 1928 paper, 'Transmission of Information', uses the word 'information' as a measurable quantity, reflecting the receiver's ability to distinguish one sequence of symbols from any other. The natural unit of information was therefore the decimal digit, much later renamed the hartley in his honour as a unit of measure of information.
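Hartley's measure can be stated compactly in modern notation (again a paraphrase, not his original symbols): a message of n symbols chosen from an alphabet of s symbols carries

H = n \log_{10} s

so that a single decimal digit (n = 1, s = 10) carries exactly one unit of information, the quantity later named the hartley.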
In 1940, Alan Turing used similar ideas as part of the statistical analysis of the breaking of the German Second World War Enigma ciphers.
The main landmark event that opened the way to the development of communication theory was the publication of Claude Shannon's article in the Bell System Technical Journal in July and October 1948, under the title 'A Mathematical Theory of Communication'. Shannon focused on the problem of how best to encode the information that a sender wants to transmit. He also used tools from probability theory, developed by Norbert Wiener, which were then in the nascent stages of being applied to communication theory. Shannon developed information entropy as a measure of the uncertainty in a message, essentially inventing the field of information theory.
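In modern notation, the entropy of a discrete source X emitting symbol x with probability p(x) is

H(X) = -\sum_x p(x) \log_2 p(x)

measured in bits per symbol; it is largest when all symbols are equally likely and shrinks as the source becomes more predictable.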
In 1949, in a declassified version of his wartime work on the mathematical theory of cryptography ('Communication Theory of Secrecy Systems'), he proved that all theoretically unbreakable ciphers must have the same requirements as the one-time pad. He is also credited with the introduction of sampling theory, which is concerned with representing a continuous-time signal by a (uniform) discrete set of samples. This theory was essential in enabling telecommunications to move from analog to digital transmission systems in the 1960s and later.
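The sampling result is usually stated today as follows (a standard modern formulation rather than a quotation): a signal containing no frequencies above B hertz is completely determined by samples taken 1/(2B) seconds apart, and can be reconstructed from them as

x(t) = \sum_n x(n / 2B) \, \mathrm{sinc}(2Bt - n)

where sinc(u) = sin(pi u) / (pi u).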
In 1951, Shannon made his fundamental contribution to natural language processing and computational linguistics with his article 'Prediction and Entropy of Printed English', providing a clear quantifiable link between cultural practice and probabilistic cognition.
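To illustrate the kind of estimate involved, the following minimal Python sketch (an illustration only; Shannon's method relied on human prediction experiments and n-gram statistics, not this shortcut) computes the zeroth-order entropy of a text sample from its letter frequencies:

from collections import Counter
from math import log2

def letter_entropy(text):
    # Crude model of printed English: keep only the 26 letters, ignoring case.
    letters = [c for c in text.lower() if 'a' <= c <= 'z']
    counts = Counter(letters)
    total = sum(counts.values())
    # Zeroth-order entropy: -sum p * log2(p) over the observed letter frequencies.
    return -sum((n / total) * log2(n / total) for n in counts.values())

sample = "the fundamental problem of communication is that of reproducing a message"
print(letter_entropy(sample))  # roughly 4 bits per letter for English-like text

Estimates of this kind give roughly four bits per letter; Shannon's paper showed that once longer-range context is taken into account, the entropy of printed English falls to roughly one bit per character.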