What does Shannon mean by "information", one of the foundations of information theory?

Claude Shannon (1916-2001) was born in 1916 in Petoskey, Michigan, USA, and grew up in Gaylord, a town of only about 3,000 residents. His father, a judge in the town, shared his name: Claude Elwood Shannon. His mother, Mabel Wolf Shannon, was the principal of the town's high school. He grew up in a well-educated environment, but his parents seem to have influenced his science less than his grandfather did. Shannon's grandfather was a farmer and inventor who invented a washing machine and many pieces of agricultural machinery, which had a direct influence on Shannon. The Shannon family was also distantly related to the great inventor Thomas Alva Edison (1847-1931).

Shannon spent most of his career at Bell Labs and MIT. After becoming famous, he married Mary Elizabeth Moore on March 27, 1949. They met at Bell Labs, where Mary was a numerical analyst. They had three children: sons Robert James and Andrew Moore, and a daughter, Margarita Catherine; two granddaughters later joined the family.

On February 24, 2001, Shannon died in Medford, Massachusetts, at the age of 84. The obituaries issued by Bell Laboratories and the Massachusetts Institute of Technology honored Shannon as the founder of information theory and of the digital communication era.

In 1936, Shannon obtained bachelor's degrees in mathematics and electrical engineering from the University of Michigan, and then entered the Massachusetts Institute of Technology for graduate study.

Shannon received a master's degree in electrical engineering from MIT in 1938; his master's thesis was entitled "A Symbolic Analysis of Relay and Switching Circuits". He had noticed the similarity between telephone switching circuits and Boolean algebra: the "true" and "false" of Boolean algebra correspond to the "on" and "off" of a circuit, and both can be represented by 1 and 0. He therefore used Boolean algebra to analyze and optimize switching circuits, which laid a theoretical foundation for digital circuit design. Professor Howard Gardner said: "This is possibly the most important, and also the most famous, master's thesis of the century."
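To make this correspondence concrete, here is a minimal Python sketch (my own illustration, not Shannon's notation): switches in series behave like Boolean AND, switches in parallel like OR, and exhaustively checking all switch settings confirms an algebraic simplification of the kind his thesis made systematic.

```python
from itertools import product

def series(a: int, b: int) -> int:
    """Two switches in series conduct only if both are closed (Boolean AND)."""
    return a & b

def parallel(a: int, b: int) -> int:
    """Two switches in parallel conduct if either is closed (Boolean OR)."""
    return a | b

# Circuit: switch A in series with (B parallel to C), i.e. A·(B + C).
circuit = lambda a, b, c: series(a, parallel(b, c))

# Boolean algebra predicts the equivalent form A·B + A·C (distributive law).
rewritten = lambda a, b, c: parallel(series(a, b), series(a, c))

# Checking every combination of on/off settings confirms the equivalence,
# which is exactly the kind of circuit optimization Shannon's thesis enabled.
for a, b, c in product((0, 1), repeat=3):
    assert circuit(a, b, c) == rewritten(a, b, c)
print("A·(B+C) = A·B + A·C for all switch settings")
```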

In 1940, Shannon received a Ph.D. in mathematics from MIT, but his doctoral thesis was about human genetics, entitled "An Algebra for Theoretical Genetics". This shows how broad Shannon's scientific interests were; he later published influential articles in several different disciplines.

While studying for his degrees, he also spent part of his time working with Professor Vannevar Bush on the differential analyzer, an early mechanical analog computer used to obtain numerical solutions of ordinary differential equations. In 1941, Shannon published "Mathematical Theory of the Differential Analyzer", in which he wrote that most of the results are given in the form of theorems, the most important of which concern the conditions under which functions of one or more variables can be generated and under which ordinary differential equations can be solved, together with some caveats and methods for approximating functions that cannot be generated exactly and for automatic control of rates.

In 1941, Shannon joined AT&T's Bell Telephone Laboratories and worked there until 1972, from age 24 to 55, a span of 31 years. In 1956 he became a visiting professor at MIT, in 1958 a full professor, and in 1978 he retired.

Accounts of Shannon's life describe him working behind closed doors during the day and riding his unicycle through the halls of Bell Labs at night. His colleague D. Slepian wrote: "We would all bring lunch to work and play math games on the blackboard after eating, but Claude rarely came over. He always worked with his door closed. However, if you went to him, he would help you very patiently. He could grasp the essence of a problem at once. He was really a genius. Of all the people I know, he is the only one for whom I would use that word."

In 1942, Shannon and John Riordan published a paper on the number of two-terminal series-parallel networks, generalizing results that Percy A. MacMahon (1854-1929) had published in The Electrician in 1892. In 1948, Shannon founded information theory.

Over those long years he thought about many problems, spending most of his time at MIT and Bell Labs, apart from one year at the Institute for Advanced Study in Princeton. It should be noted that during World War II Shannon was also a noted cryptanalyst (which calls to mind Alan Turing, four years his senior). His group at Bell Labs worked mainly on tracking German planes and rockets, especially during the German rocket blitz against Britain. In 1949, Shannon published another important paper, "Communication Theory of Secrecy Systems". Grounded in this wartime work, its significance lies in turning secret communication from an art into a science.

In 1948, Shannon published "A Mathematical Theory of Communication" in the Bell System Technical Journal. The work is signed by Shannon and Weaver. Warren Weaver (1894-1978) was then director of the natural sciences division of the Rockefeller Foundation, and he wrote an introduction for the work. Afterward, Shannon continued to do technical work, while Weaver studied the philosophical problems of information theory. Incidentally, the indefinite article "A" was used when the paper was first published; it was changed to the definite article "The" when the work was collected in book form.

The concept of entropy

An important feature of Shannon's theory is the concept of entropy, which he showed measures the uncertainty in a quantity of information. Entropy had been introduced by Boltzmann in connection with the second law of thermodynamics, where we can understand it as the degree of disorder of molecular motion; information entropy has a similar meaning. For example, in Chinese information processing, the static average information entropy of Chinese characters is relatively large: about 9.65 bits, versus 4.03 bits for English. This suggests that Chinese is more complex than English; it reflects that Chinese is rich in meaning and concise in writing, but also harder to process. A large information entropy means large uncertainty. We should therefore study the problem deeply and seek real breakthroughs in Chinese information processing, rather than blindly assuming that because Chinese characters are the most beautiful script in the world they must also be the easiest to handle.

As we all know, mass, energy and information are three very important quantities.

Mass, measured with scales and balances, has been understood for a very long time. Energy became clear only in the mid-19th century, with the demonstration of the mechanical equivalent of heat and the establishment of the law of conservation of energy; the word "energy" became the collective name for its various forms, and the problem of measuring it was solved by the emergence of units such as the calorie and the joule.

Knowledge recorded in words, numbers, images and sounds, however, has a history of thousands of years, yet the questions of what their common name should be and how to measure them uniformly were not even posed correctly until the end of the 19th century, let alone solved. At the beginning of the 20th century, with the development of the telegraph, telephone, photography, television, radio and radar, the problem of how to measure the information in a signal was vaguely placed on the agenda.

In 1928, R. V. L. Hartley considered the problem of forming a word by taking N symbols from D different symbols. If each symbol is equally probable and chosen completely at random, D^N different words can be obtained. Drawing one particular word from among them corresponds to an amount of information I. Hartley suggested expressing this amount of information as I = N log D, where log denotes the logarithm to base 10. Later, Norbert Wiener, the founder of cybernetics, also studied the measurement of information and related it to the second law of thermodynamics.
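As a concrete check of Hartley's measure, here is a small Python sketch with illustrative numbers of my own choosing (D = 10 symbols, words of length N = 3):

```python
import math

D, N = 10, 3
num_words = D ** N        # 10^3 = 1000 equally likely words
I = N * math.log10(D)     # Hartley's I = N log D = 3 (base-10 units)
print(num_words, I)       # -> 1000 3.0
```

Note that I depends only on the number of equally likely alternatives, which is exactly the limitation Shannon later removed by allowing unequal probabilities.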

But it was Shannon who gave the basic mathematical model of information transmission. Shannon's dozens-of-pages-long 1948 paper "A Mathematical Theory of Communication" became the milestone marking the formal birth of information theory. In his mathematical model of communication he explicitly posed the problem of measuring information, extended Hartley's formula to the case of unequal probabilities p_i, and obtained the famous formula for the information entropy H:

H = -∑ p_i log p_i

If the logarithm is taken to base 2, the information entropy comes out in bits. The bytes, KB, MB and GB so widely used in computing and communications today all derive from the bit. The appearance of the "bit" marks the point at which human beings knew how to measure the amount of information; Shannon's information theory thus made the decisive contribution to defining the concept of information quantity.
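A minimal sketch of the formula in code, with example distributions of my own choosing, shows how entropy in bits behaves: a fair coin carries exactly 1 bit per toss, a biased (more predictable) coin carries less, and equal probabilities recover Hartley's log D.

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p_i * log2 p_i) in bits; 0*log(0) taken as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))   # fair coin: 1.0 bit
print(entropy_bits([0.9, 0.1]))   # biased coin: ~0.469 bits, less uncertainty
print(entropy_bits([0.25] * 4))   # 4 equal symbols: log2(4) = 2.0 bits
```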

In fact, Shannon's initial motivation was to combat noise on the telephone. He derived the upper limit on the rate of reliable communication over a noisy channel. This result was applied first to telephony, then to optical fiber, and now to wireless communication. That we can make clear overseas or satellite calls today is closely tied to the resulting improvements in channel quality.
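The upper limit referred to here is Shannon's channel capacity; for a bandlimited channel with Gaussian noise it takes the well-known Shannon-Hartley form C = B log2(1 + S/N). A small sketch with illustrative numbers (my own, not from the article):

```python
import math

def channel_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 3.1 kHz telephone channel with a signal-to-noise ratio of 1000 (30 dB)
# can carry at most about 30,900 bits per second, no matter how clever the
# modulation scheme is.
print(channel_capacity(3100, 1000))
```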

Claude Shannon is not especially well known to the general public, but he is one of the few scientists and thinkers who made our world of instant communication possible. He was a member of the National Academy of Sciences, the National Academy of Engineering, the Royal Society and the American Philosophical Society, and he won many honors and awards: the Morris Liebmann Award in 1949, the Ballantine Medal in 1955, the Kelly Award in 1962, the National Medal of Science in 1966, the IEEE Medal of Honor, the Jacquard Award in 1978, and the John Fritz Medal in 1983, among others. His honorary degrees are too numerous to list here.

Remembering Shannon today, we should be familiar with his two great contributions: one is information theory and the concept of information entropy; the other is symbolic logic and switching theory. We should also learn from the scientific spirit behind his success: curiosity, respect for practice, the pursuit of perfection, and never being satisfied.
