Who published "Communication Theory of Secrecy Systems" in 1949?
Shannon published "Communication Theory of Secrecy Systems" in 1949.

Claude Elwood Shannon (April 30, 1916 - February 24, 2001) was an American mathematician and the founder of information theory. He received a bachelor's degree from the University of Michigan in 1936, received his master's and doctoral degrees from the Massachusetts Institute of Technology in 1940, and joined Bell Laboratories in 1941.

Shannon put forward the concept of information entropy, which laid the foundation of information theory and digital communication. His main works are his 1938 master's thesis "A Symbolic Analysis of Relay and Switching Circuits", "A Mathematical Theory of Communication" (1948), and "Communication in the Presence of Noise" (1949). The Claude E. Shannon Award, named in his memory, is the highest honor in the field of communication theory and is sometimes called the "Nobel Prize of information theory".

The achievements of Claude Elwood Shannon

The central feature of Shannon's theory is the concept of entropy, which he showed to be a measure of the uncertainty in a quantity of information. Entropy had earlier been introduced into the second law of thermodynamics, where Boltzmann interpreted it as the degree of disorder of molecular motion. Shannon extended the entropy concept of statistical physics to the process of communication over a channel, thereby establishing the discipline of information theory.

Shannon's definition of entropy is also called "Shannon entropy" or "information entropy": H = -K * sum_i p_i * log(p_i), where the sum runs over all possible samples x_i in the probability space, p_i is the probability that sample x_i occurs, and K is an arbitrary constant depending on the choice of units. It is apparent that this definition of information entropy differs from the thermodynamic entropy of statistical mechanics (the Boltzmann formula) only by a proportional constant.
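The definition above can be sketched in a few lines of Python. This is a minimal illustration, assuming K = 1 and a base-2 logarithm so that entropy is measured in bits; the function name is chosen here for clarity and is not from any particular library.

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log p) over a probability distribution.

    Terms with p == 0 contribute nothing, since p * log(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit of entropy.
# A certain outcome carries no uncertainty: 0 bits.
fair_coin = shannon_entropy([0.5, 0.5])   # 1 bit
certain = shannon_entropy([1.0])          # 0 bits
biased = shannon_entropy([0.9, 0.1])      # less than 1 bit
print(fair_coin, certain, biased)
```

The biased coin's entropy lies strictly between 0 and 1 bit, matching the intuition that a nearly predictable source conveys less information per outcome.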

Entropy refers to the degree of disorder in a system and has important applications in cybernetics, probability theory, number theory, astrophysics, the life sciences, and other fields. It has more specific definitions within different disciplines and is a very important parameter in each. The concept of entropy was put forward by Rudolf Clausius and first applied in thermodynamics; Claude Elwood Shannon was the first to introduce it into information theory.