Claude Shannon (1916-2001) was born in 1916 in Gaylord, Michigan, USA, a town of only 3,000 residents at the time. Shannon's father was a judge in the town, and his mother was a middle-school principal there. He grew up in a well-educated environment, but his parents seem to have influenced his science less than his grandfather did. Shannon's grandfather was a farmer and inventor who built a washing machine and many pieces of farm machinery, and this had a direct influence on Shannon. In addition, the Shannon family was distantly related to the great inventor Thomas Alva Edison (1847-1931).
Shannon made two great contributions: the first is information theory and the concept of information entropy; the other is symbolic logic and switching theory. His information theory made the decisive contribution of defining the concept of the quantity of information.
In 1936, Shannon received a bachelor's degree in mathematics and electrical engineering from the University of Michigan, and then entered the Massachusetts Institute of Technology for graduate study. He received a master's degree in electrical engineering from MIT in 1938; his master's thesis was A Symbolic Analysis of Relay and Switching Circuits. He had noticed the similarity between telephone switching circuits and Boolean algebra: the "true" and "false" of Boolean algebra correspond to the "on" and "off" of a circuit, and both can be represented by 1 and 0. He therefore used Boolean algebra to analyze and optimize switching circuits, which laid the theoretical foundation of digital circuits.
Harvard University professor Howard Gardner said, "This is possibly the most important, and also the most famous, master's thesis of the century."
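The correspondence Shannon exploited is easy to see in code. Here is a minimal sketch (the function names are illustrative, not from his thesis): two switches in series conduct only when both are closed, which is Boolean AND; two switches in parallel conduct when either is closed, which is Boolean OR.

```python
# Switch states: 1 = closed (conducting), 0 = open.
def series(a: int, b: int) -> int:
    """Two switches in series: current flows only if both are closed (AND)."""
    return a & b

def parallel(a: int, b: int) -> int:
    """Two switches in parallel: current flows if either is closed (OR)."""
    return a | b

# The truth tables of the circuits match those of the Boolean operations.
for a in (0, 1):
    for b in (0, 1):
        print(f"a={a} b={b}  series={series(a, b)}  parallel={parallel(a, b)}")
```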
In 1940, Shannon received a Ph.D. in mathematics from MIT, but his doctoral thesis was about human genetics, entitled An Algebra for Theoretical Genetics. This shows the breadth of Shannon's interests; he later published influential articles in many different disciplines. While studying for his degrees he also spent part of his time working on the differential analyzer with Professor Vannevar Bush. This analyzer was an early mechanical analog computer used to obtain numerical solutions of ordinary differential equations. In 1941, Shannon published "Mathematical Theory of the Differential Analyzer," writing: "Most of the results are given in the form of theorems. The most important deal with the conditions under which functions of one or more variables can be generated, and the conditions under which ordinary differential equations can be solved. Some attention is also given to approximation of functions, approximation of rates of adjustment, and automatic control of rates."
In 1941, Shannon joined the AT&T Bell Telephone Company and worked at Bell Laboratories until 1972, from age 24 to age 55, a span of 31 years. In 1942, Shannon and John Riordan published a paper on the number of two-terminal series-parallel networks, generalizing the results MacMahon had published in The Electrician in 1892. In 1948, Shannon published "A Mathematical Theory of Communication," founding information theory.
During World War II, Dr. Shannon was also a famous codebreaker (reminiscent of Dr. Turing, who was four years his senior). His deciphering team at Bell mainly tracked German planes and rockets, especially during the German rocket blitz on Britain. In 1949, Shannon published another important paper, "Communication Theory of Secrecy Systems." Based on his wartime work, its significance lay in turning secure communication from an art into a science.
The concept of entropy
The most important feature of Shannon's theory is the concept of entropy, which he showed to measure the uncertainty of a message, and hence its information content. Entropy had been introduced by Boltzmann in connection with the second law of thermodynamics, where it can be understood as the degree of disorder of molecular motion. Information entropy has a similar meaning. For example, in Chinese information processing, the static average information entropy of a Chinese character is relatively large, 9.65 bits, compared with 4.03 bits for an English letter. This shows that Chinese is more complex than English, reflecting that Chinese is rich in meaning and concise in writing, but it is also harder to process. A large information entropy means large uncertainty. We should therefore study the problem deeply and seek fundamental breakthroughs in Chinese information processing, rather than blindly assuming that because Chinese characters are the most beautiful script in the world they must also be the easiest to process.
As we all know, mass, energy, and information are three very important quantities. Measuring the mass of matter with scales or balances has long been understood, while the relationship between heat and work only became clear in the middle of the 19th century, with the definition of the mechanical equivalent of heat and the establishment of the law of conservation of energy. "Energy" is their collective name, and the measurement of energy was solved by the introduction of units such as the calorie and the joule. Knowledge of words, numbers, images, and sounds, however, has a history of thousands of years; yet what their common name is and how to measure them uniformly were questions not even correctly posed until the end of the 19th century, let alone solved.
At the beginning of the 20th century, with the development of the telegraph, telephone, photography, television, radio, and radar, the problem of how to measure the information in a signal was vaguely put on the agenda. In 1928, R. V. L. Hartley considered the problem of forming a word by taking N symbols from D different symbols. If each symbol is equally probable and completely randomly selected, D^N different words can be obtained. Selecting one particular word from among them corresponds to an amount of information I. Hartley suggested that this amount of information be expressed as I = N log D, where log denotes the base-10 logarithm. Later, Norbert Wiener, the founder of cybernetics, also studied the problem of measuring information and related it to the second law of thermodynamics. But it was Shannon who gave the basic mathematical model of information transmission. Shannon's 1948 paper "A Mathematical Theory of Communication," running to dozens of pages, became the milestone marking the formal birth of information theory. In his mathematical model of communication he explicitly posed the problem of measuring information, extended Hartley's formula to the case of unequal probabilities p_i, and obtained the famous formula for the information entropy H:
H = -∑ p_i log p_i
If the logarithm is taken to base 2, the information entropy is measured in bits. Units such as the byte, KB, MB, and GB, so widely used in computing and communications today, all derive from the bit. The appearance of the "bit" marks the point at which human beings knew how to measure the amount of information.
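The formula is easy to evaluate directly. Here is a minimal sketch in Python (the helper name entropy is illustrative): for a uniform distribution over D symbols it reduces to Hartley's log D, while skewed distributions give smaller values, reflecting lower uncertainty. Applied to letter-frequency statistics, this is how figures like the 4.03 bits per English letter quoted above are obtained.

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H = -sum(p_i * log p_i); in bits when base is 2."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
print(entropy([0.9, 0.1]))   # ~0.47 bits: a biased coin is more predictable
print(entropy([0.25] * 4))   # 2.0 bits: uniform over 4 symbols = log2(4), Hartley's case
```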
Shannon's initial motivation was to combat noise on the telephone. He gave an upper limit on the rate at which information can be transmitted reliably over a noisy channel, the channel capacity. This conclusion was applied first to the telephone, then to optical fiber, and now to wireless communication. That today we can make clear overseas or satellite calls is closely related to the resulting improvement in channel quality.
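For the common case of a channel with bandwidth B and signal-to-noise ratio S/N, this limit is the Shannon-Hartley capacity C = B log2(1 + S/N). A minimal sketch follows; the 3100 Hz bandwidth and 30 dB SNR are typical textbook values for an analog telephone line, not figures from Shannon's paper.

```python
import math

def channel_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley limit C = B * log2(1 + S/N), in bits per second."""
    snr = 10 ** (snr_db / 10)        # convert decibels to a power ratio
    return bandwidth_hz * math.log2(1 + snr)

# A classic analog phone line: ~3100 Hz bandwidth, ~30 dB signal-to-noise ratio.
print(f"{channel_capacity(3100, 30):,.0f} bit/s")   # ~30,900 bit/s
```

This is why late analog modems, at 33.6 kbit/s, were already pressing against the limit of the ordinary telephone line.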
Shannon spent most of his time at Bell Labs and MIT. People describe his life there: he worked behind closed doors during the day and rode his unicycle through Bell Laboratories at night. His colleague D. Slepian said, "We all brought lunch to work and played math games at the blackboard afterward, but Claude seldom came over. He always worked with his door closed. But if you went to him, he would help you very patiently; he could grasp the essence of a problem at once. He was a real genius. Of all the people I know, he is the only one I would use that word for."
Claude Shannon is not particularly well known to the public, yet he is one of the few scientists who made today's instantaneous worldwide communication possible. He was a member of the U.S. National Academy of Sciences, the National Academy of Engineering, the Royal Society, and the American Philosophical Society, and he won many honors and awards: the 1949 Morris Liebmann Memorial Award, the 1955 Stuart Ballantine Medal, the 1962 Mervin J. Kelly Award, the 1966 National Medal of Science, the IEEE Medal of Honor, the 1978 Jacquard Award, and the 1983 John Fritz Medal, as well as numerous honorary degrees.
Both Bell Laboratories and MIT regard Shannon as the founder of information theory and of the digital communication era. It was he who put the "true" and "false" of Boolean algebra into correspondence with the "on" and "off" of circuits, representing both by 1 and 0, a crucial link from theory to actual product design.