$r_{xx}(m) = E[x(m+n)\,x^*(n)]$    (4-1)
and the power spectrum $P_{xx}(e^{j\omega})$ form a Fourier transform pair, that is
$P_{xx}(e^{j\omega}) = \sum_{m=-\infty}^{\infty} r_{xx}(m)\, e^{-j\omega m}$    (4-2)
If x(n) is ergodic, its autocorrelation function can be obtained from a single sample sequence by time averaging, that is
$r_{xx}(m) = \lim_{N\to\infty} \frac{1}{2N+1} \sum_{n=-N}^{N} x(m+n)\, x^*(n)$    (4-3)
In most applications x(n) is a real signal, so the above formula can be written as
$r_{xx}(m) = \lim_{N\to\infty} \frac{1}{2N+1} \sum_{n=-N}^{N} x(m+n)\, x(n)$    (4-4)
In fact, usually only a limited number of sampled values (say, N values) of a random signal can be observed in the time domain, which can be expressed as
$x_N(n) = \{x(0), x(1), \ldots, x(N-1)\} = \{x(n),\ n = 0, 1, \ldots, N-1\}$
Its autocorrelation function can only be estimated from these N sampled data, and the biased estimate is usually used:
$\hat{r}_{xx}(m) = \frac{1}{N} \sum_{n=0}^{N-1-|m|} x(n+|m|)\, x(n), \qquad |m| \le N-1$    (4-5)
This is an asymptotically consistent estimate, called the sample autocorrelation function.
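As a concrete illustration, the following minimal Python sketch computes the biased estimate of equation (4-5); the function name and the noisy-sinusoid test signal are illustrative assumptions, not taken from the text.

```python
# A minimal sketch of the biased estimate in equation (4-5), assuming a real-valued
# record x of length N; the function name and the test signal are illustrative.
import numpy as np

def sample_autocorrelation(x):
    """Biased estimate r_hat(m) = (1/N) * sum_n x(n+|m|) x(n) for m = 0..N-1."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    r = np.empty(N)
    for m in range(N):
        r[m] = np.dot(x[m:], x[:N - m]) / N   # divide by N (biased), not by N - m
    return r                                  # r(-m) = r(m) for real signals

# Example: sample autocorrelation of a noisy sinusoid
rng = np.random.default_rng(0)
n = np.arange(256)
x = np.cos(2 * np.pi * 0.1 * n) + 0.5 * rng.standard_normal(n.size)
r_hat = sample_autocorrelation(x)
```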
The Fourier transform of the sample autocorrelation function is then used as the estimate of the power spectrum. This method was proposed by R. Blackman and J. Tukey in 1958 and is called the autocorrelation method of power spectrum estimation (the BT method for short). It first forms the estimated autocorrelation function from the finite observed data and then computes the power spectrum according to equation (4-2). Before the fast Fourier transform (FFT) algorithm appeared, it was the most popular power spectrum estimation method.
In 1965, Cooley and Tukey published the famous FFT algorithm, which improved the computation speed of the Fourier transform by about two orders of magnitude, greatly reduced the amount of calculation, and made the DFT widely used in many fields, especially in engineering practice. According to formula (4-5), the sample autocorrelation is, up to the factor 1/N, the convolution of $x_N(n)$ with $x_N(-n)$, because
$\hat{r}_{xx}(m) = \frac{1}{N} \sum_{n} x_N(n+m)\, x_N(n) = \frac{1}{N}\, x_N(m) * x_N(-m)$
If the Fourier transform of $x_N(n)$ is $X_N(e^{j\omega})$, then the Fourier transform of $x_N(-n)$ equals $X_N^*(e^{j\omega})$ for real data. Taking the Fourier transform of both sides of formula (4-5) gives
$\hat{P}_{xx}(e^{j\omega}) = \frac{1}{N}\, X_N(e^{j\omega})\, X_N^*(e^{j\omega}) = \frac{1}{N}\, \big|X_N(e^{j\omega})\big|^2$
This shows that the power spectrum can be estimated by taking the discrete Fourier transform of the random data directly, squaring its magnitude, and, when several samples are available, averaging the results; this is Schuster's periodogram. This kind of spectrum estimation has attracted wide attention because it computes the power spectrum directly, without first calculating the autocorrelation function.
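Under the formula above, the periodogram can be computed directly with the FFT. The sketch below is one possible implementation, with simple averaging over non-overlapping segments standing in for averaging over multiple samples; the segment length and test signal are illustrative assumptions.

```python
# A sketch of the periodogram P_hat(e^{jw}) = |X_N(e^{jw})|^2 / N computed with the
# FFT, plus averaging over non-overlapping segments; names and parameters are
# illustrative, not prescribed by the text.
import numpy as np

def periodogram(x, nfft=None):
    x = np.asarray(x, dtype=float)
    N = len(x)
    X = np.fft.rfft(x, n=nfft)        # DFT of the finite record (one-sided)
    return np.abs(X) ** 2 / N         # squared magnitude scaled by 1/N

def averaged_periodogram(x, seg_len):
    """Average the periodograms of non-overlapping segments of the record."""
    x = np.asarray(x, dtype=float)
    nseg = len(x) // seg_len
    segs = x[:nseg * seg_len].reshape(nseg, seg_len)
    return np.mean([periodogram(s) for s in segs], axis=0)

# Example: averaged periodogram of a sinusoid in white noise
rng = np.random.default_rng(1)
n = np.arange(4096)
x = np.sin(2 * np.pi * 0.2 * n) + rng.standard_normal(n.size)
P_avg = averaged_periodogram(x, seg_len=256)   # frequencies 0 .. 0.5 cycles/sample
```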
The periodogram and the autocorrelation method, together with their improved variants, are called the classical methods of power spectrum estimation, and the periodogram and the autocorrelation method are its two basic forms. Because of the FFT, the periodogram and the autocorrelation method are often combined; the steps are as follows (a code sketch follows the steps):
(1) Pad $x_N(n)$ with N zeros and compute its $2N$-point Fourier transform;
(2) From this transform obtain, by an inverse Fourier transform, the sample autocorrelation $\hat{r}_{xx}(m)$ for $|m| \le N-1$;
(3) Multiply by a lag window function v(m), now with $|m| \le M \ll N-1$, to obtain the windowed autocorrelation $\hat{r}_{xx}(m)\,v(m)$;
(4) Take the Fourier transform of the result, i.e.
$\hat{P}_{xx}(e^{j\omega}) = \sum_{m=-M}^{M} \hat{r}_{xx}(m)\, v(m)\, e^{-j\omega m}$
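One possible implementation of steps (1) to (4) is sketched below. The Hamming lag window and the value of M are illustrative assumptions; the text does not prescribe a particular window.

```python
# A sketch of steps (1)-(4): pad x_N(n) with N zeros, recover the sample
# autocorrelation via the FFT, apply a lag window v(m) with |m| <= M << N-1, and
# Fourier-transform the windowed autocorrelation. Window choice and M are assumed.
import numpy as np

def bt_spectrum(x, M, nfreq=512):
    x = np.asarray(x, dtype=float)
    N = len(x)
    # (1)-(2): zero-pad to 2N and recover r_hat(m), m = 0..N-1, by an inverse FFT
    X = np.fft.fft(x, n=2 * N)
    r = np.fft.ifft(np.abs(X) ** 2).real[:N] / N
    # (3): multiply by a lag window v(m), |m| <= M << N-1 (Hamming chosen here)
    v = np.hamming(2 * M + 1)[M:]              # one-sided half of v(m), m = 0..M
    rw = r[:M + 1] * v
    # (4): Fourier transform of the windowed autocorrelation; r(-m) = r(m) for
    # real data, so P(w) = v(0) r(0) + 2 * sum_{m=1..M} v(m) r(m) cos(w m)
    w = np.linspace(0.0, np.pi, nfreq)
    P = rw[0] + 2.0 * np.cos(np.outer(w, np.arange(1, M + 1))) @ rw[1:]
    return w, P

# Example usage on a noisy sinusoid (illustrative signal)
rng = np.random.default_rng(2)
n = np.arange(512)
x = np.cos(2 * np.pi * 0.15 * n) + rng.standard_normal(n.size)
w, P_bt = bt_spectrum(x, M=64)
```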
The variance of the power spectrum estimate obtained by the periodogram method does not tend to zero as the sample length increases. Surprisingly, no matter how long the data record is, neither the periodogram nor the autocorrelation method gives a good power spectrum estimate; in fact, as the record length grows, the random fluctuation of these two estimates becomes more severe. In addition, they have the following two inherent shortcomings that are difficult to overcome.
(1) The frequency resolution (the ability to distinguish two adjacent frequency components) is low. Their frequency resolution (in Hz) is inversely proportional to the data record length (in seconds), that is, $\Delta f = k/T_p = k/(NT)$, where k is a constant, $T_p = NT$ is the record length, and T is the sampling period. In practice it is usually impossible to obtain long data records; only limited data can be observed, and the unobserved data are treated as zero. Thus, if only N observations are available and the signal is still strongly correlated beyond those N samples, the estimated power spectrum will be strongly biased.
(2) Observing only finite data is equivalent to multiplying the signal by a rectangular window function in the time domain, and therefore to convolving the true power spectrum with a sinc function in the frequency domain. Unlike the δ function, the sinc function has a main lobe and sidelobes, so the convolved power spectrum differs from the true one. The main lobe of the sinc function is not infinitely narrow, which spreads the power spectrum into neighboring frequencies, smearing the spectrum and reducing its resolution. At the same time, the sidelobes of the sinc function leak energy away from the main lobe (so-called sidelobe leakage), which causes interference between spectral components: the sidelobes of a strong signal's power spectrum hinder the detection of a weak signal's power spectrum. In severe cases this distorts the spectrum so badly that weak signals cannot be detected, or sidelobes are mistaken for signals, producing false spectral lines. Various window functions can be used to improve the classical estimate, but the result is a wider main lobe in exchange for lower sidelobes, so low spectral resolution remains the fatal drawback of classical power spectrum estimation (both effects are illustrated in the sketch below).
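The short sketch below illustrates both shortcomings on synthetic data: with N = 100 samples at a sampling period of T = 1 ms the resolution is roughly 1/(NT) = 10 Hz (taking k = 1), so a weak tone only 4 Hz away from a strong one cannot be separated, and tapering the record merely trades sidelobe leakage for an even wider main lobe. All numerical values are arbitrary illustrative choices.

```python
# Illustration (not from the text) of limited resolution and sidelobe leakage.
import numpy as np

T = 0.001                       # sampling period (s), i.e. fs = 1000 Hz
N = 100                         # record length -> resolution roughly 1/(N*T) = 10 Hz
n = np.arange(N)
# strong 100 Hz tone plus a weak tone only 4 Hz away (below the resolution limit)
x = np.sin(2 * np.pi * 100 * n * T) + 0.05 * np.sin(2 * np.pi * 104 * n * T)

freqs = np.fft.rfftfreq(4096, d=T)                    # dense grid via zero-padded FFT
P_rect = np.abs(np.fft.rfft(x, 4096)) ** 2 / N        # rectangular window (no taper)
P_hamm = np.abs(np.fft.rfft(x * np.hamming(N), 4096)) ** 2 / N   # tapered record
# In P_rect the weak 104 Hz line is buried in the sidelobes of the 100 Hz line;
# in P_hamm the sidelobes drop but the main lobe widens, so the two lines still merge.
```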
To overcome these shortcomings, long-term efforts have produced methods such as averaging and window smoothing, which improve the performance of classical power spectrum estimation to some extent. Practice has shown that the classical, Fourier-transform-based methods are indeed practical for long data records. However, the contradiction between frequency resolution and the stability (variance) of the estimate cannot be fundamentally resolved by classical methods, especially for short data records. This spurred the development of modern power spectrum estimation methods.
Modern power spectrum estimation is mainly based on parametric models of the random process, and is therefore called the parametric model method. Although research on and application of modern power spectrum estimation mainly began in the 1960s, time series models had long been used in non-engineering fields: Yule used an autoregressive model to predict and describe the trend of economic time series in 1927, Walker applied the autoregressive model in 1931, and Prony used exponential models for fitting as early as 1795. Model-based methods have also long been used in statistics and numerical analysis.
Modern power spectrum estimation is aimed mainly at the poor resolution and variance performance of the classical methods (the periodogram and the autocorrelation method). Inspired by linear prediction filtering in seismological research, Burg proposed the maximum entropy spectral estimation method in 1967, the most significant step toward improving resolution. Parzen formally proposed autoregressive spectral estimation in 1968. In 1971, Van den Bos proved that one-dimensional maximum entropy spectral estimation is equivalent to autoregressive spectral estimation. The Prony spectral estimation method of 1972 is mathematically similar to the autoregressive method. Spectrum estimation based on the autoregressive moving average (ARMA) model offers higher frequency resolution and better performance than the autoregressive model. The harmonic decomposition method proposed by Pisarenko in 1973 provides a reliable frequency estimation method, and in 1981 Schmidt proposed the MUSIC algorithm. Modern power spectrum analysis therefore mainly comprises four classes of methods: ARMA spectral analysis, maximum likelihood, entropy spectral estimation, and eigendecomposition. ARMA spectral analysis is a modeling approach: the power spectral density is estimated by fitting a model to a stationary linear signal process. Entropy spectral estimation includes the maximum entropy spectrum and the minimum cross-entropy method. Eigendecomposition, also called the eigenstructure or subspace method, includes the Pisarenko harmonic decomposition method, the Prony method, the MUSIC method, and the ESPRIT method (estimation of signal parameters via rotational invariance techniques).
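As an illustration of the autoregressive approach (equivalent, by the result noted above, to one-dimensional maximum entropy estimation), the sketch below estimates an AR spectrum by solving the Yule-Walker equations; the model order p and the test signal are assumptions made for illustration, and Burg's recursion is a common alternative to the direct Yule-Walker solution used here.

```python
# A minimal autoregressive (Yule-Walker) spectrum estimate; order p and signal are
# illustrative assumptions.
import numpy as np

def ar_spectrum(x, p, nfreq=512):
    x = np.asarray(x, dtype=float)
    N = len(x)
    # biased sample autocorrelation for lags 0..p
    r = np.array([np.dot(x[m:], x[:N - m]) / N for m in range(p + 1)])
    # Yule-Walker equations: R a = -[r(1) ... r(p)], R Toeplitz built from r(0)..r(p-1)
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    a = np.linalg.solve(R, -r[1:])                 # AR coefficients a_1 .. a_p
    sigma2 = r[0] + a @ r[1:]                      # driving white-noise variance
    # P_AR(w) = sigma^2 / |1 + sum_k a_k e^{-jwk}|^2
    w = np.linspace(0.0, np.pi, nfreq)
    A = 1.0 + np.exp(-1j * np.outer(w, np.arange(1, p + 1))) @ a
    return w, sigma2 / np.abs(A) ** 2

# Example: two closely spaced sinusoids in noise, observed over a short record
rng = np.random.default_rng(3)
n = np.arange(128)
x = (np.sin(2 * np.pi * 0.20 * n) + np.sin(2 * np.pi * 0.23 * n)
     + 0.2 * rng.standard_normal(n.size))
w, P_ar = ar_spectrum(x, p=12)
```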
Research on modern power spectrum estimation is still largely focused on one-dimensional power spectrum analysis and is mostly based on second-order moments (the correlation function, variance, and power spectral density). However, because the power spectral density is a real function of frequency and carries no phase information, spectral estimation methods based on higher-order spectra are attracting attention; in particular, research on bispectrum and trispectrum estimation is highly valued. Other directions, such as multidimensional and multichannel spectrum estimation, are also developing. These newer methods are expected to find wider application in information extraction, phase estimation, and the description of nonlinearity.