An overview of the emergence and development of information theory
Information theory exists in both a narrow and a broad sense. Information theory in the narrow sense is a science that applies mathematical statistics to the study of information processing and information transmission. It investigates the common laws of information transmission in communication and control systems, and how to improve the effectiveness and reliability of information transmission systems. Narrow-sense information theory was founded by Shannon in 1948. Its main content is the study of sources, destinations, channels and coding, so it was applied chiefly to communication. Information theory then developed rapidly, and Shannon's theory came to be extended into a framework for studying information problems in general, that is, information theory in the broad sense. Information theory is built on the concept of information: broadly speaking, all the changes around us that human beings perceive through the sense organs can be called information.
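The narrow-sense theory rests on a quantitative measure of information. As an illustration only, the short sketch below computes Shannon's 1948 entropy formula, H = -Σ p·log₂ p, for a simple source; the function name and the example probabilities are our own illustrative choices, not taken from the source text.

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log p); base 2 gives bits."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less,
# because its outcome is more predictable.
fair = shannon_entropy([0.5, 0.5])    # 1.0 bit
biased = shannon_entropy([0.9, 0.1])  # about 0.47 bits
```

The drop from 1.0 to about 0.47 bits illustrates the basic point of the quantitative theory: the less uncertain a source is, the less information each symbol conveys.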

Practice has shown that everything in the world is in a process of motion and change, and that in this process all kinds of information are given off: internal changes are reflected outward, and external appearances reflect the internal essence; changes in the whole are reflected in its parts, and local changes affect the whole. To survive in the world, human beings must recognize and use information; otherwise they cannot survive. Preventing and avoiding natural disasters, inventing and creating tools, developing production and scientific experiment are all inspired by information of one kind or another. War, too, is inseparable from information: in the Zhou Dynasty, for example, China used beacon towers to transmit news of war. In short, from the day human beings appeared, they have lived in an ocean of information and could not live without it for a moment. The key problem is how to understand and use information to change one's living conditions and create a better living environment. It is precisely according to changes in the surrounding world that human beings adjust their actions and change their strategies in the struggle with nature; in this sense, human activity always centers on information. The viewpoint of information theory is to summarize people's interaction with the outside world as a process of information and information feedback. The reason human beings can maintain normal living conditions and constantly improve them is that they have the ability to obtain, use, preserve and transmit information. This ability to recognize and use information, like other abilities, has developed from the simple to the complex.

Human beings have the ability to obtain information, but information is not unique to human beings; other creatures also have the instinct to recognize and use information. The saying that beasts have their own language refers to this, and phenomena such as the protective coloration of some animals, and their hunting behavior, are likewise manifestations of recognizing and using information. Plants too have their own sensitivity to information; the biological clock, for instance, is in fact a response to information. It can therefore be said that every creature in the world is related to information. The difference is that human beings understand and use information far more widely and deeply than other creatures, and bring creativity to bear on it, while other creatures use it merely to survive.

Human beings need information, and they also need to transmit it. The function of language is to transmit information, and it is the best means of doing so. But language is limited by distance: where sound cannot reach, it loses its transmitting function, and it is easily disturbed by external noise. Human beings therefore invented writing, which is another way of transmitting information, but it too has its limitations, and so human beings have gone on to create many further methods of transmitting information. The purpose of transmitting information is to make reasonable decisions. Although human beings have a long history of recognizing and using information, that is, of information feedback, genuine research on information theory has a history of only a little more than half a century, and its breakthrough began with methods of information transmission. In 1922 Carson expounded the theory of sidebands and pointed out that modulation (coding) broadens the signal spectrum; this can be regarded as a forerunner of information theory.

In 1924, Nyquist in the United States and Küpfmüller in Germany confirmed Carson's theory. In 1928, Hartley published "Transmission of Information", which first proposed that information consists of codes, symbols and sequences rather than the content itself; by excluding subjective factors it achieved a conceptual breakthrough. Hartley also put forward the concept of the quantity of information for the first time and tried to describe it with a numerical formula, which provided a theoretical basis for information transmission and promoted the establishment of information theory. As early as 1838 the American Morse had invented an efficient method of coded telegraphy, giving mankind a more advanced tool for transmitting information. During World War II, from around 1940, the military significance of communication led the American Shannon to begin studying information theory. In 1945 Potter published his work on the visible patterns of sound, and in 1947, together with Kopp and others, he wrote "Visible Speech", making the conditions for the establishment of information theory ever more mature. In 1948 Shannon published "A Mathematical Theory of Communication", announcing the establishment of information theory. The main content of Shannon's information theory is the study of sources, destinations, channels and coding. After the war, the needs of communication and the rapid development of electronic technology promoted the further development of information theory; scholars in many countries did a great deal of research and made great achievements.
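Hartley's 1928 quantitative idea, and the way Shannon's 1948 formula generalizes it, can be stated compactly. The following are the standard textbook forms, added here for illustration:

```latex
% Hartley (1928): a message of n symbols drawn from an alphabet of S
% equally likely symbols selects one of S^n possibilities, so its
% information content is
H = \log S^{\,n} = n \log S .
% Shannon (1948) generalizes this to symbols with probabilities p_i:
H = -\sum_{i} p_i \log p_i ,
% which reduces to Hartley's per-symbol measure \log S when all S
% symbols are equiprobable, i.e. p_i = 1/S.
```

The generalization is exactly what made the theory applicable beyond telegraphy: real sources do not emit all symbols with equal likelihood.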

The achievements of information theory brought hope to many disciplines, and people tried to apply its basic theory to difficult problems in fields such as organization, semantics, hearing, neurology, physiology and psychology. A conference on information theory held in London in 1950 received more than 20 papers, 6 of which concerned applications of information theory in psychology and neurophysiology. In 1951 the American Institute of Radio Engineers recognized information theory as a new discipline and established a professional group on information theory. The third information theory conference, held in London in 1955, covered anatomy, animal health, neurophysiology, neuropsychiatry and psychology. In 1956 Shannon, speaking of the development prospects of information theory, said: "Many concepts of information theory will be useful in some fields such as psychology and economics."

In the 1950s the Englishman E. C. Cherry also made a historical survey of information, publishing "A History of the Theory of Information" and "On Human Communication". The 1950s likewise saw the influence of information theory on the various sciences. L. Brillouin, a French physicist living in the United States, extended information theory into physics. In "Life, Thermodynamics, and Cybernetics" he held that "in order to reliably apply similar concepts to basic problems related to life and intelligence, besides the old classical concept of physical entropy, some bold new generalizations are needed." In 1956 he published "Science and Information Theory", trying to link information with concrete physical processes and directly connecting information entropy with thermodynamic entropy; he put forward the principle of the negentropy of information, holding that information and negative entropy are equivalent. In 1979, Carker applied this principle to discuss the efficiency of the steady-state chemical function of the human kidney and drew conclusions of practical value.
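Brillouin's negentropy principle, which the paragraph above describes, is usually stated as follows; this is a sketch of the standard textbook form, not a quotation from his book:

```latex
% One bit of information is equivalent to a thermodynamic entropy of
\Delta S_{\mathrm{bit}} = k \ln 2 \approx 0.96 \times 10^{-23}\ \mathrm{J/K},
% where k is Boltzmann's constant. Acquiring \Delta I bits of
% information about a system requires the entropy of the surroundings
% to increase by at least that amount:
\Delta S_{\mathrm{surr}} \ge k \ln 2 \, \Delta I .
% Since information lowers the observer's uncertainty about the system,
% Brillouin identifies information with negentropy, N = -S.
```

This is the sense in which the text says information and negative entropy are "equivalent": gaining information is never free, but must be paid for in entropy elsewhere.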

The 1960s were a period of consolidation and construction on the basis of existing information theory, with emphasis on source coding and channel coding. During this period information theory was also extended and applied to biology; a representative in neurophysiology was Ashby in the United States, who pointed out that "information theory is essentially a branch of combinatorial theory, and its essence is basically counting". In the 1960s information theory came to be divided into three kinds: narrow-sense information theory, general information theory and broad-sense (generalized) information theory.

In the 1970s, with the wide application of digital computers, the capabilities of communication systems were greatly improved. How to use and process information more effectively became an increasingly urgent problem, and people became ever more aware of the importance of information. The concepts and methods of information penetrated into every scientific field, and it became urgent to break through the narrow scope of Shannon's information theory in order to promote the further development of many emerging disciplines. In the 1970s new progress was made in information transmission, and several source and channel coding theorems under new conditions appeared; Berger wrote a fairly complete book on information sources and coding. On the questions of semantics and validity, Gao Aisi put forward the concept of "effective information" in 1971; in 1978, Verma and others, modifying its additivity, extended it to the non-additive "generalized effective information". There were also the "semantic information" proposed by Carnap and others, the "non-probabilistic information" proposed by Tingel, the "relative information" proposed by Jumarie, and so on. After Zadeh put forward fuzzy mathematics in 1965, some researchers defined concepts of "entropy" and information on the basis of fuzzy set theory and went on to attempt a "fuzzy information theory"; others envisaged an "algorithmic information theory" built on the information problems arising in computers. In natural science and philosophy, some study information as a basic parameter, holding that forms, structures, relations and the like are more basic than matter and energy, and that these different structures and relations are characterized by information.
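As an illustration of what it means to define "entropy" on a fuzzy set, the sketch below computes the De Luca and Termini fuzzy entropy, one standard such definition from the fuzzy-set literature; the function name and the sample membership values are our own illustrative choices, not drawn from the source text.

```python
import math

def fuzzy_entropy(memberships):
    """De Luca-Termini fuzzy entropy of a fuzzy set given its
    membership degrees mu in [0, 1]: zero for a crisp set
    (all mu in {0, 1}), maximal when every mu = 0.5."""
    h = 0.0
    for mu in memberships:
        for p in (mu, 1.0 - mu):
            if p > 0:
                h -= p * math.log2(p)
    return h

crisp = fuzzy_entropy([0.0, 1.0, 1.0])  # 0.0: no fuzziness at all
vague = fuzzy_entropy([0.5, 0.5, 0.5])  # 3.0: maximal fuzziness
```

Unlike Shannon entropy, which measures uncertainty about which outcome occurs, this quantity measures how far set membership itself is from being clear-cut, which is why it vanishes exactly on ordinary (crisp) sets.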

In a word, the progress of this decade went far beyond the scope of Shannon's information theory. As Longo pointed out in "Information Theory: New Trends and Open Problems", published in 1975, "what Shannon carefully excluded from his contribution is now included". Today information theory is widely used in physics, chemistry, biology, psychology, management and other disciplines, and a science that takes information as its object of study is taking shape.