Y = [y1, y2, y3, y4]; % Y is a 14*4 matrix.
R = [r1, r2, r3, r4]'; % The row vector is transposed here into a 4*1 column vector.
p = Y * R; % p is the resulting 14*1 weighted composite score.
Problem supplement: In general the weight coefficients sum to 1, but that is not required here, because y1 to y4 are indicators of different types, so the weights reflected in GDP need not sum to 1.
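For concreteness, here is a minimal runnable sketch of the computation above with made-up data; the vectors y1 to y4 and the weights r1 to r4 are placeholders, not values from the original problem.
y1 = rand(14,1); y2 = rand(14,1); y3 = rand(14,1); y4 = rand(14,1); % placeholder indicator series
r1 = 0.8; r2 = 1.2; r3 = 0.5; r4 = 2.0; % placeholder weights; they need not sum to 1
Y = [y1, y2, y3, y4]; % 14*4 indicator matrix
R = [r1, r2, r3, r4]'; % 4*1 weight column vector
p = Y * R; % 14*1 weighted composite score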
Question 2: What is the difference between the correlation coefficient and covariance? Both describe how two variables vary together. Covariance is the statistic obtained by multiplying each variable's deviation from its mean, X minus its mean times Y minus its mean, and averaging the products. Although it captures the joint variation of X and Y, the units of X and Y may differ, so the result of directly multiplying their deviations can vary wildly in magnitude. The units therefore need to be removed, which is done by dividing the covariance by the standard deviations of X and Y.
Because the correlation coefficient is obtained by dividing the covariance by the standard deviations of the two variables, the correlation coefficient is a standardized (dimensionless) quantity, while the covariance is not standardized.
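As a sketch of this relationship with made-up data (cov and corrcoef are base MATLAB functions):
x = randn(100,1); % made-up variable X
y = 2*x + randn(100,1); % made-up variable Y, correlated with X
C = cov(x, y); % 2*2 covariance matrix; C(1,2) still carries the units of X and Y
r_manual = C(1,2) / (std(x) * std(y)); % divide by both standard deviations to remove the units
Rxy = corrcoef(x, y); % r_manual matches Rxy(1,2), the built-in correlation coefficient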
Question 3: What does the dot in front of a correlation coefficient in the correlation coefficient matrix mean? If the KMO value is greater than 0.5, factor analysis is appropriate and can be carried out. In addition, if Bartlett's test gives p < 0.001, the correlation coefficient matrix is not an identity matrix, so a small number of factors can be extracted that still explain most of the variance; that is, the validity is acceptable.
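Bartlett's test of sphericity can be computed by hand from the correlation matrix using its usual formula; the sketch below uses made-up data and assumes the Statistics Toolbox for chi2cdf (MATLAB has no built-in KMO function, so KMO is not shown):
X = randn(200, 5); % made-up n*p data matrix: n = 200 observations, p = 5 variables
[n, p] = size(X);
R = corrcoef(X); % p*p correlation coefficient matrix
chi2 = -((n - 1) - (2*p + 5)/6) * log(det(R)); % Bartlett sphericity test statistic
df = p*(p - 1)/2; % degrees of freedom
pval = 1 - chi2cdf(chi2, df); % pval < 0.001 indicates R is not an identity matrix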
Question 4: What is the difference between the correlation coefficient matrix and the covariance matrix? Correlation coefficient matrix: a matrix describing the correlation between variables, with the dimensions (units) removed.
Covariance matrix: a matrix describing the joint variation between variables, with the dimensions (units) retained.
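The two matrices are related by dividing out the standard deviations, as this sketch with made-up data shows:
X = randn(50, 3); % made-up data, 50 observations of 3 variables
C = cov(X); % covariance matrix, units retained
D = diag(1 ./ sqrt(diag(C))); % reciprocals of the standard deviations
R_manual = D * C * D; % dimensionless correlation coefficient matrix
R_builtin = corrcoef(X); % agrees with R_manual up to rounding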
Question 5: The meaning of the correlation coefficient. There are the following kinds of correlation coefficients (a short MATLAB sketch after the list illustrates several of them):
1. Simple correlation coefficient: also called the correlation coefficient or the linear correlation coefficient. Generally denoted by the letter r, it measures the linear correlation between two quantitative variables.
2. Complex correlation coefficient: also called the multiple correlation coefficient. Multiple correlation refers to the correlation between a dependent variable and several independent variables taken together. For example, there is a multiple correlation between the demand for a commodity, its price level, and employees' income level.
3. Partial correlation coefficient: reflects the correlation between one variable and another after controlling for the remaining variables; "controlling" can be understood as holding all the other variables at their mean values. The hypothesis test of a partial correlation coefficient is equivalent to the t test of the corresponding partial regression coefficient, and the hypothesis test of the multiple correlation coefficient is equivalent to the analysis of variance of the regression equation.
4. Canonical correlation coefficient: first, principal component analysis is performed on each group of original variables to obtain new, linearly independent composite indicators; then the correlation between the two original groups of variables is studied through the linear correlation between the two groups' composite indicators.
5. Coefficient of determination: the square of the correlation coefficient. Significance: the larger the coefficient of determination, the better the independent variables explain the dependent variable, the higher the proportion of the total variation accounted for by the independent variables, and the more tightly the observations cluster around the regression line.
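The sketch below illustrates several of these coefficients with made-up data; partialcorr and canoncorr assume MATLAB's Statistics Toolbox:
n = 100;
x1 = randn(n,1); x2 = 0.5*x1 + randn(n,1); % first group of variables
y1 = x1 + x2 + randn(n,1); y2 = x2 + randn(n,1); % second group of variables
Rs = corrcoef(y1, x1); % 1. simple correlation coefficient r = Rs(1,2)
b = [ones(n,1), x1, x2] \ y1; % regression of y1 on x1 and x2
Rm = corrcoef(y1, [ones(n,1), x1, x2] * b); % 2. multiple correlation coefficient Rm(1,2)
rp = partialcorr(y1, x1, x2); % 3. partial correlation of y1 and x1, controlling for x2
[~, ~, rc] = canoncorr([x1, x2], [y1, y2]); % 4. canonical correlations between the two groups
R2 = Rs(1,2)^2; % 5. coefficient of determination, the square of the simple correlation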
Question 6: Introduction to the correlation matrix. The correlation matrix, also called the correlation coefficient matrix, consists of the correlation coefficients between the columns of a data matrix. That is, the element in the i-th row and j-th column of the correlation matrix is the correlation coefficient between the i-th and j-th columns of the original matrix.
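For example, the (i, j) entry of corrcoef(X) is exactly the correlation coefficient of columns i and j of X; a quick check with made-up data:
X = randn(14, 4); % made-up data matrix
R = corrcoef(X); % 4*4 correlation coefficient matrix
r12 = corrcoef(X(:,1), X(:,2)); % correlation of columns 1 and 2 computed separately
check = abs(R(1,2) - r12(1,2)) < 1e-12; % the two values agree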