What does variance mean in junior high school?
In probability theory and statistics, variance is a measure of how dispersed a random variable or a set of data is.

Extended knowledge

First, about variance.

In probability theory, variance measures how far a random variable deviates from its mathematical expectation (that is, its mean). In statistics, the sample variance is the average of the squares of the differences between each sample value and the sample mean. In many practical problems it is important to study this deviation, because variance measures how far the data lie from their expected value.
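Written in the notation commonly used in textbooks (here n, x_1, …, x_n, \bar{x} and s^2 are just the conventional symbols for the number of data values, the data, their mean and the sample variance), the two definitions read:

\mathrm{Var}(X) = E\big[(X - E[X])^2\big], \qquad s^2 = \frac{1}{n}\Big[(x_1 - \bar{x})^2 + (x_2 - \bar{x})^2 + \cdots + (x_n - \bar{x})^2\Big]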

The word "variance" was first put forward by ronald fisher in his paper "Genetic Relationship Supported by Mendel Genetics".

Second, the statistical meaning

When the data are spread out (that is, they fluctuate widely around the mean), the sum of the squared differences between each data value and the mean is large, so the variance is large. When the data are concentrated near the mean, this sum of squares is small, so the variance is small. Therefore, the larger the variance, the greater the fluctuation in the data; the smaller the variance, the smaller the fluctuation.
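As a small made-up illustration, the data sets {4, 5, 6} and {1, 5, 9} both have mean 5, but the second set is much more spread out, and the variances reflect this:

s_A^2 = \frac{(4-5)^2 + (5-5)^2 + (6-5)^2}{3} = \frac{2}{3} \approx 0.67, \qquad s_B^2 = \frac{(1-5)^2 + (5-5)^2 + (9-5)^2}{3} = \frac{32}{3} \approx 10.67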

The average of the squared differences between each data value in a sample and the sample mean is called the sample variance; the arithmetic square root of the sample variance is called the sample standard deviation. Both are measures of how much the sample fluctuates: the larger the sample variance or standard deviation, the greater the fluctuation of the sample data.
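With the same symbols as above, the sample standard deviation is simply the arithmetic square root of the sample variance:

s = \sqrt{s^2} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}(x_i - \bar{x})^2}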

Variance and standard deviation are the most important and most commonly used measures of dispersion. Variance is the average of the squared deviations of each value from the mean, and it is the principal way to measure how spread out numerical data are.

The difference between the standard deviation and the variance is that the standard deviation has the same unit as the variable itself, which makes it easier to interpret than the variance; for example, if the data are lengths in centimetres, the variance is in square centimetres while the standard deviation is again in centimetres. For this reason, the standard deviation is used more often in analysis.