What is entropy?
The general understanding of entropy is as follows:

The popular explanation is that entropy is an indicator of the degree of disorder in a system: the more disordered a system is, the greater its entropy; the more ordered, the smaller its entropy. Any isolated system tends to evolve from order toward disorder, and this is the principle of entropy increase. For example, once a scent has spread through a room, or a sheet of paper has been crumpled into wrinkles, it is very hard to reverse the change spontaneously.
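One way to make "more disorder means higher entropy" concrete is the Shannon/Gibbs entropy of a probability distribution, H = -Σ p ln p. This is a minimal sketch (the distributions and state count are illustrative assumptions, not taken from the text): a system certainly in one state is perfectly ordered and has zero entropy, while a system equally likely to be in any state is maximally disordered and has maximal entropy.

```python
import math

def entropy(probs):
    """Gibbs/Shannon entropy H = -sum(p * ln p), in nats.
    Terms with p == 0 contribute nothing (lim p->0 of p*ln p is 0)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Perfect order: the system is certainly in one of four states.
ordered = [1.0, 0.0, 0.0, 0.0]
# Maximal disorder: all four states are equally likely.
disordered = [0.25, 0.25, 0.25, 0.25]

print(entropy(ordered))     # 0.0
print(entropy(disordered))  # ln 4, about 1.386 nats
```

The uniform distribution maximizes the entropy for a fixed number of states, which matches the intuition that the most "chaotic" arrangement carries the largest entropy.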

How the statistical concept of entropy was put forward:

In 1877, Boltzmann put forward the statistical-physics interpretation of entropy. In a series of papers, he showed that the macroscopic physical properties of a system can be regarded as statistical averages over all of its possible microscopic states, each microstate taken with equal probability.
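Boltzmann's interpretation is usually summarized by his famous formula (the formula itself is standard, though not written out in the text above):

```latex
S = k_{\mathrm{B}} \ln W
```

Here $S$ is the entropy, $k_{\mathrm{B}}$ is the Boltzmann constant, and $W$ is the number of microscopic states compatible with the given macroscopic state. The more microstates a macrostate admits, the more disordered it is and the higher its entropy.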

For example, consider an ideal gas in a container. A microscopic state can be specified by the position and momentum of every gas atom. All admissible microscopic states must satisfy two conditions: the position of every particle lies within the volume of the container, and the sum of the kinetic energies of all atoms equals the total energy of the gas.
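The counting of admissible microstates can be sketched with a discretized toy model (an assumption for illustration, not the continuous ideal gas itself): each of N particles occupies one of a few discrete energy levels, and a microstate is admissible only if the level energies sum to the fixed total energy, the discrete analogue of the kinetic-energy constraint above. Boltzmann's formula then gives the entropy directly from the count.

```python
import itertools
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

# Toy model: 3 particles, energy levels 0..4, fixed total energy 4.
N = 3
LEVELS = range(5)
E_TOTAL = 4

# Enumerate all assignments of levels to particles and keep only those
# whose energies sum to the total energy (the admissibility constraint).
microstates = [s for s in itertools.product(LEVELS, repeat=N)
               if sum(s) == E_TOTAL]

W = len(microstates)        # number of admissible microstates
S = K_B * math.log(W)       # Boltzmann entropy S = k_B ln W
print(W)                    # 15 admissible microstates
print(S)
```

For a real gas the microstates form a continuum of positions and momenta, so the count becomes a phase-space volume integral rather than a finite enumeration, but the structure of the constraint is the same.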