Bandwidth and signal-to-noise ratio of Shannon formula
In recent years, 5G has become a hot topic and one of the signature technologies of this era. Communication technology is constantly improving and upgrading, and that progress has now ushered in 5G.

Fifth-generation (5G) mobile communication is the latest generation of mobile communication technology. Its performance goals are higher data rates, lower latency, energy savings, lower cost, greater system capacity, and large-scale device connectivity.

In practice, communication technology rests on a theorem: Shannon's theorem (also known as the noisy-channel coding theorem). So what is Shannon's theorem, and what role does it play in information theory?

Claude Elwood Shannon

Shannon's theorem

In information theory, Shannon's theorem (the noisy-channel coding theorem, which establishes the Shannon limit) states that for any given degree of noise contamination of a communication channel, discrete data (i.e. digital information) can be communicated nearly error-free up to a computable maximum rate through the channel. This result was presented by Claude Shannon in 1948, building in part on earlier work and ideas of Harry Nyquist and Ralph Hartley.

Put simply, Shannon's theorem says that although noise interferes with a communication channel, data can still be transmitted with arbitrarily low error probability provided the information rate is kept below the channel capacity. This surprising result, sometimes called the fundamental theorem of information theory, was first presented by Claude Elwood Shannon in 1948.

Channel capacity

The channel capacity of a communication channel (also known as the Shannon limit or Shannon capacity) is the theoretical maximum rate at which error-free data can be transmitted over the channel when the link is subject to random transmission errors at a given noise level. It was first described by Shannon (1948) and soon afterward published in the book The Mathematical Theory of Communication by Shannon and Warren Weaver (1949), which established the modern discipline of information theory.

In short, the channel capacity (also called the Shannon limit) of a communication channel is its theoretical maximum transmission rate under a specified noise level.

Figure 1: Channel capacity

The Shannon formula

A channel is a medium for transmitting information, and channel capacity describes the maximum amount of information the channel can carry without error; it can therefore be used to measure the quality of the channel. In 1948, Shannon gave the definition and calculation of channel capacity in his famous paper "A Mathematical Theory of Communication": channel capacity is the supremum of the mutual information between the input signal and the output signal. It is calculated by the Shannon formula, C = B·log2(1 + S/N), which can also be read as the maximum number of information bits the channel can transmit per second. Here B is the channel bandwidth, S is the average power of the transmitted signal, and N is the average power of the noise or interference.

For example, an additive white Gaussian noise (AWGN) channel with signal-to-noise ratio S/N and bandwidth B has channel capacity C = B·log2(1 + S/N).
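The formula above is easy to evaluate directly. The sketch below computes the capacity of a hypothetical 3 kHz channel at a 30 dB signal-to-noise ratio (both figures are illustrative assumptions, not values from the text); note that a dB figure must first be converted to a linear power ratio.

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon formula: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical 3 kHz channel with a 30 dB signal-to-noise ratio:
snr = 10 ** (30 / 10)            # 30 dB -> linear power ratio of 1000
c = channel_capacity(3000, snr)
print(f"C = {c:.0f} bit/s")      # just under 30 kbit/s
```

The same function applies to any bandwidth/SNR pair, as long as the SNR is supplied as a linear ratio rather than in decibels.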

Note that C/B = log2(1 + S/N) is the spectral efficiency of the channel, i.e. the amount of information that can be transmitted per unit time and unit bandwidth, in bit·s⁻¹·Hz⁻¹. Raising the signal-to-noise ratio raises the channel capacity, which can be done by suppressing noise or by increasing the transmit power. If the SNR were infinite, the capacity would tend to infinity; but since there is always noise in the channel and transmitter power cannot be infinite, this never happens. Increasing the channel bandwidth also increases the capacity, but not without bound: if the noise power spectral density of the channel is N0, the noise power N = B·N0 grows along with the bandwidth B. Writing S for the signal power, the limit of the channel capacity as the bandwidth tends to infinity is:

lim(B→∞) C = lim(B→∞) B·log2(1 + S/(B·N0)) = (S/N0)·log2 e ≈ 1.44·S/N0

Figure 2: Channel capacity in the infinite-bandwidth limit

It can be seen that increasing the bandwidth alone is not a good way to improve the channel capacity.
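The saturation effect is easy to see numerically. In the sketch below, the noise power is modeled as N = B·N0 as in the text, with an assumed ratio S/N0 = 1 (chosen for illustration); capacity climbs quickly at first and then flattens toward the limit (S/N0)·log2 e ≈ 1.4427.

```python
import math

# Assumed ratio of signal power S to noise power spectral density N0:
S_over_N0 = 1.0

def capacity(bandwidth_hz):
    """C = B * log2(1 + S/(B*N0)): noise power N = B*N0 grows with bandwidth."""
    return bandwidth_hz * math.log2(1 + S_over_N0 / bandwidth_hz)

limit = S_over_N0 * math.log2(math.e)   # infinite-bandwidth limit, ~1.4427 * S/N0

for b in (1, 10, 100, 10_000):
    print(f"B = {b:>6} Hz -> C = {capacity(b):.4f} bit/s (limit {limit:.4f})")
```

Each tenfold increase in bandwidth buys less and less extra capacity, which is exactly why "just add bandwidth" stops paying off.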

Channel capacity is the theoretical limit of a channel's ability to carry information. In the communication technologies deployed today, the actual channel throughput falls well short of this limit.

From the above, we can see that faster network speeds require a larger channel bandwidth and a higher signal-to-noise ratio: the former is the maximum frequency range over which the signal can pass effectively through the channel, and the latter is the ratio of signal power to noise power. Within this constraint, increasing network speed means increasing the channel bandwidth, improving the signal-to-noise ratio, or both.
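The two levers behave very differently, which a small comparison makes concrete. The figures below (a 20 MHz channel at 20 dB SNR) are illustrative assumptions; note that holding the SNR fixed while doubling the bandwidth implicitly assumes the signal power is doubled too, since N = B·N0.

```python
import math

def capacity(bandwidth_hz, snr_linear):
    """Shannon formula: C = B * log2(1 + S/N), in bit/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical 20 MHz channel at 20 dB SNR (linear ratio 100):
base = capacity(20e6, 100)
wider = capacity(40e6, 100)      # double the bandwidth, SNR held fixed
stronger = capacity(20e6, 200)   # double the signal power instead

print(f"baseline:        {base / 1e6:.1f} Mbit/s")
print(f"2x bandwidth:    {wider / 1e6:.1f} Mbit/s")    # capacity exactly doubles
print(f"2x signal power: {stronger / 1e6:.1f} Mbit/s")  # adds only ~1 bit/s per Hz
```

Doubling the bandwidth (at constant SNR) doubles the capacity, while doubling the signal power only adds about one bit per second per hertz, because power sits inside the logarithm.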

Learning tip

We can compare a communication channel to a city road. The traffic this road can carry per unit time is constrained by factors such as road width and speed limits; under these constraints, the maximum traffic per unit time is the limit value.


References

Saleem Bhatti. "Channel capacity". Data Communication Networks and Distributed Systems D51 – Basic Communications and Networks. [2007-11-10]. (Archived from the original on 2007-08-21.)

Jim Lesurf. "Signals look like noise!". Information and Measurement, 2nd edition. [2007-11-10]. (Archived from the original on 2016-12-28.)

Thomas M. Cover, Joy A. Thomas. Elements of Information Theory. New York: John Wiley & Sons, 2006.

New Knowledge in the Encyclopedia | The Shannon Formula and 5G. (Archived from the original, 2016-12-28.)