4.10. Spectral Windows

Spectral windows have an established role in the analysis of random noise data (11-13). They are obtained by averaging power spectrum results at several (usually three) adjacent frequencies in order to improve over the (sin x)/x filtering that is inherent in Fourier analysis (see Section 2.11). A common spectral window is the Hanning window, which is obtained as follows:

φ'(ωᵢ) = 0.25φ(ωᵢ₋₁) + 0.5φ(ωᵢ) + 0.25φ(ωᵢ₊₁)    (4.10.1)

where φ'(ωᵢ) is the modified power spectrum estimate at frequency ωᵢ, and φ(ωᵢ) is the original power spectrum estimate at frequency ωᵢ. The effective filtering inherent in the original estimate and in the modified estimate is shown in Fig. 4.15. It is evident that the use of a Hanning window decreases the contribution from frequencies far from the analysis frequency, but increases the contribution of nearby frequencies.

Fig. 4.15. Hanning and Hamming spectral windows. [Abscissa: (harmonic number) − (analysis harmonic number).]

Another common spectral window is the Hamming window. It is given by

φ'(ωᵢ) = 0.27φ(ωᵢ₋₁) + 0.46φ(ωᵢ) + 0.27φ(ωᵢ₊₁)    (4.10.2)

The filtering obtained with the Hamming window also appears in Fig. 4.15. We see that the Hamming window has a larger central lobe than the Hanning window, but smaller side lobes.
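As an illustrative sketch (not from the original text), the three-point smoothing of Eqs. (4.10.1) and (4.10.2) might be applied to an array of power spectrum estimates as follows; the function name, the use of NumPy, and the random test data are assumptions made only to keep the example self-contained.

    import numpy as np

    def smooth_spectrum(phi, w_side, w_center):
        # Apply a three-point spectral window: the modified estimate at
        # frequency i is w_side*phi[i-1] + w_center*phi[i] + w_side*phi[i+1].
        # The two end points are left unchanged because they lack a neighbor.
        phi = np.asarray(phi, dtype=float)
        out = phi.copy()
        out[1:-1] = w_side * phi[:-2] + w_center * phi[1:-1] + w_side * phi[2:]
        return out

    phi = np.random.rand(16)                          # hypothetical raw power spectrum estimates
    phi_hanning = smooth_spectrum(phi, 0.25, 0.5)     # weights of Eq. (4.10.1)
    phi_hamming = smooth_spectrum(phi, 0.27, 0.46)    # weights of Eq. (4.10.2)

The same routine covers any three-point window simply by changing the two weights.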

The advantages of spectral windows in analyzing random noise data are obvious. The noise has a continuous spectrum, and the analysis must pick out the information in the desired frequency range and eliminate information from other frequencies.

The situation with periodic data is quite different. For the case of a perfect, noise-free periodic signal, Fourier analysis at a harmonic frequency automatically places the other harmonics at null points in the filter, and spectral windows are totally unnecessary. However, spectral windows may be helpful in nonideal situations where background noise is a problem or where analysis of a nonintegral number of periods is unavoidable (see Section 4.8). In some cases, narrow-band noise from sources such as 60-Hz pickup or from mechanical vibrations can be a problem. These signals can have a large amplitude relative to the periodic test signals. It may be advantageous to reduce the side lobes of the effective filter in Fourier analysis to reduce the effect of narrow-band noise. Spectral windows can help when a nonintegral number of periods is unavoidable because the resulting filtering effect weights frequencies rather evenly near the analysis frequency and reduces the weights of frequencies far from the analysis frequency.

Spectral windows may be used in tests using multiple periods of a periodic signal. For example, let us assume that two periods of the input are used. If the period is T, then the data record length is 2T, and the harmonics based on the length of the data record are at integer multiples of 1/(2T). We can average the Fourier coefficients using formulas similar to the Hanning or Hamming formulas:

F'{ωₖ} = 0.5F{ωₖ₋₁} + F{ωₖ} + 0.5F{ωₖ₊₁}    (4.10.3)

or

F'{ωₖ} = 0.587F{ωₖ₋₁} + F{ωₖ} + 0.587F{ωₖ₊₁}    (4.10.4)

Fig. 4.16. Hanning and Hamming windows for two periods of periodic data. [Abscissa: (harmonic number) − (analysis harmonic number); markers denote the locations of the signal harmonics.]

The resulting filtering effect is shown in Fig. 4.16. Since the Fourier coefficients at ωₖ₋₁ and ωₖ₊₁ are zero, F'{ωₖ} is identical to F{ωₖ} if there is no noise.
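A minimal sketch of this two-period averaging, assuming the complex Fourier coefficients of the full record are obtained with NumPy's FFT and that the synthetic test signal below merely stands in for real data:

    import numpy as np

    def windowed_coefficient(F, k, side_weight):
        # Eqs. (4.10.3)/(4.10.4): combine a coefficient with its two neighbors.
        # F is indexed by harmonic number of the total record; for two signal
        # periods the signal harmonics fall on even indices, so F[k-1] and
        # F[k+1] vanish for noise-free data. side_weight is 0.5 for the
        # Hanning-like form and 0.587 for the Hamming-like form.
        return side_weight * F[k - 1] + F[k] + side_weight * F[k + 1]

    N = 64                                   # samples per period (illustrative)
    t = np.arange(2 * N)                     # two periods, record length 2T
    x = np.sin(2 * np.pi * t / N) + 0.3 * np.sin(2 * np.pi * 3 * t / N)

    F = np.fft.fft(x) / len(x)               # coefficients based on the 2T record
    F1 = windowed_coefficient(F, k=2, side_weight=0.5)   # signal fundamental = record harmonic 2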

This type of averaging to achieve better filtering can be extended if more than two periods are analyzed. For example, if three periods are used, then the following five-frequency averaging procedure minimizes the contribution from the side lobes of the filter:

F'{ωₖ} = 0.095F{ωₖ₋₂} + 0.595F{ωₖ₋₁} + F{ωₖ} + 0.595F{ωₖ₊₁} + 0.095F{ωₖ₊₂}    (4.10.5)

The filtering associated with this procedure is shown in Fig. 4.17. Clearly, it does a better job than Hanning or Hamming in suppressing side lobes of the filter. The price paid for this is a broadening of the central lobe, but this should not be a serious problem in most applications with periodic signals.

Fig. 4.17. Five-frequency spectral windows. [Abscissa: (harmonic number) − (analysis harmonic number).]

A practical procedure for using the five-frequency window is:

1. Select at least three periods of data for analysis.

2. Fourier analyze at a harmonic that is nonzero (a frequency that is a harmonic of the signal period T and is therefore also a harmonic of the total record length).

3. Analyze at two harmonics on each side of the nonzero harmonic. Since these are harmonics based on the total record length, the Fourier analysis is allowable, but since these frequencies are not harmonics of the original signal, the Fourier coefficients should be zero (except for the effect of noise).

4. Average the calculated Fourier coefficients as follows:

F'{ωₖ} = 0.095F{ωₖ₋₂} + 0.595F{ωₖ₋₁} + F{ωₖ} + 0.595F{ωₖ₊₁} + 0.095F{ωₖ₊₂}    (4.10.6)

This is done independently for input and output signals, and the frequency response is obtained by forming the ratio of the modified output transform to the modified input transform.
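As a hedged sketch of this procedure (not from the original text), assuming the sampled records cover an exact whole number of signal periods and that NumPy's FFT supplies the Fourier coefficients:

    import numpy as np

    def five_frequency_response(inp, out, periods, harmonic):
        # Frequency response at a signal harmonic using the five-frequency window.
        # inp, out : input and output records covering `periods` (>= 3) complete
        #            periods of the test signal.
        # harmonic : harmonic number based on the signal period T; the analysis
        #            harmonic based on the record length is harmonic * periods,
        #            and its four neighbors are harmonics of the record but not
        #            of the signal, so they carry only noise (step 3 above).
        if periods < 3:
            raise ValueError("need at least three periods of data")
        Fin = np.fft.fft(inp) / len(inp)
        Fout = np.fft.fft(out) / len(out)
        k = harmonic * periods
        w = np.array([0.095, 0.595, 1.0, 0.595, 0.095])   # weights of Eq. (4.10.6)
        idx = np.arange(k - 2, k + 3)
        return np.dot(w, Fout[idx]) / np.dot(w, Fin[idx])

    # e.g., G = five_frequency_response(input_record, output_record, periods=3, harmonic=1)

The ratio in the last line is the modified output transform divided by the modified input transform, as described above.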

Spectral windows can also be applied to the data before it is Fourier analyzed. In this case, each data point is multiplied by a factor before Fourier transformation. For example, the functions in the following table can be multiplied by the data records to give the desired spectral windows, where C₁, C₂, and C₃ are constants that determine the area under the filter. They are immaterial in frequency response tests where identical windows are applied to input and output signals.

Spectral window    Function to be multiplied by data record
Hanning            C₁[1 − cos(2πt/T)]
Hamming            C₂[1 − 0.8519 cos(2πt/T)]
Five-frequency     C₃[1 − 1.19 cos(2πt/T) + 0.19 cos(4πt/T)]

It may appear that it is simpler to apply the windows to the time-domain data. This may not be true because it requires N multiplications for N data points, and each of the factors for the window function must be calculated or stored.

It may be noted from the time-domain representations of the spectral windows that they force the windowed function to have the same value at the start of a period (t = 0) and at the end of a period (t = T). This may be used to eliminate some of the problems caused by drift.
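A sketch of this time-domain alternative, assuming the tabulated constants C₁, C₂, and C₃ are set to 1 (they cancel when identical windows are applied to input and output) and that the data record spans exactly one period T:

    import numpy as np

    def time_domain_window(n_points, kind="five-frequency"):
        # Window factors from the table above, evaluated at t = 0, dt, ..., T - dt.
        t_over_T = np.arange(n_points) / n_points
        c2 = np.cos(2 * np.pi * t_over_T)
        if kind == "hanning":
            return 1.0 - c2
        if kind == "hamming":
            return 1.0 - 0.8519 * c2
        if kind == "five-frequency":
            return 1.0 - 1.19 * c2 + 0.19 * np.cos(4 * np.pi * t_over_T)
        raise ValueError("unknown window: " + kind)

    # multiply a data record by the window before Fourier transformation, e.g.
    # windowed = record * time_domain_window(len(record))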

4.11. Correlation Functions

The cross-correlation function between a periodic input I(t) and the resulting output O(t) is

C₁₂(τ) = (1/T) ∫_{−T/2}^{T/2} I(t) O(t + τ) dt    (4.11.1)

In Section 2.7, it was shown that for periodic signals

F{C₁₂(τ)}ωₖ = C₋ₖDₖ    (4.11.2)

If Cₖ is the Fourier transform of the input, and Dₖ is the Fourier transform of the output, then

Dₖ = G(jωₖ)Cₖ    (4.11.3)

where G(jωₖ) is the system frequency response at frequency ωₖ, and

F{C₁₂(τ)}ωₖ = CₖC₋ₖG(jωₖ)    (4.11.4)

Then

C₁₂(τ) = F⁻¹{CₖC₋ₖG(jωₖ)}    (4.11.5)

We may use the convolution theorem to interpret this. The result is

C₁₂(τ) = ∫_0^∞ h(p) C₁₁(τ − p) dp    (4.11.6)

where h(p) = F⁻¹{G(jω)} is the system impulse response, and C₁₁(p) is the autocorrelation function of the input (the inverse Fourier transform of the input power spectrum). If the input signal had an autocorrelation function that was a delta function, then the result would be

C₁₂(τ) = h(τ)    (4.11.7)

This indicates that the impulse response could be obtained by determining the cross-correlation function between input and output for a test that used an input signal whose autocorrelation function was a delta function. Such a signal would have a power spectrum that was constant over all frequencies.

No periodic signal can have this property, but several useful signals have autocorrelation functions that approximate delta functions and power spectra that are quite flat over a broad frequency range. For example, the PRBS (see Fig. 3.3), the PRTS (see Fig. 3.9) and the n sequence (see Fig. 3.8) all have autocorrelation functions with a series of spikes. These spikes become sharper as the number of bits in the sequence increases, giving a better approximation to a delta function. For these signals,

C₁₂(τ) ≅ h(τ)    (4.11.8)

The approximation is due to the departure of the autocorrelation function from a true delta function.
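As an illustration of Eqs. (4.11.1) and (4.11.8) (not part of the original text), the cross-correlation of sampled periodic records can be estimated as below; the random two-level sequence and the first-order impulse response are stand-ins chosen only to make the sketch self-contained, not an actual PRBS design.

    import numpy as np

    def periodic_cross_correlation(inp, out):
        # Discrete estimate of C12(tau) = (1/T) * integral of I(t)O(t + tau) dt
        # for records covering an integral number of periods; the circular shift
        # (np.roll) implements the periodicity of the signals.
        return np.array([np.mean(inp * np.roll(out, -lag)) for lag in range(len(inp))])

    rng = np.random.default_rng(0)
    prbs = np.where(rng.integers(0, 2, size=255) > 0, 1.0, -1.0)  # stand-in for one PRBS period
    h_true = 0.2 * np.exp(-0.2 * np.arange(255))                  # assumed first-order impulse response
    out = np.real(np.fft.ifft(np.fft.fft(prbs) * np.fft.fft(h_true)))  # periodic (circular) convolution

    h_est = periodic_cross_correlation(prbs, out)
    # Because the autocorrelation of the two-level sequence is approximately a
    # unit spike at zero lag, h_est approximates h_true, as in Eq. (4.11.8).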

It is possible to obtain the response of the system to any input using the convolution integral if the impulse response is known. A response of particular interest is the step response. This may be obtained by integrating the impulse response. This approach may be preferred over a simple test involving a step input because greater accuracy is possible. This is because an impulse response (and the resulting step response) is obtained from analysis of multiple periods of data in a way that discriminates against noise errors. There is no way to achieve a comparable enhancement of the desired step response in a direct step response test.
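Continuing the sketch above, a step response estimate follows from a running (rectangular-rule) integration of the estimated impulse response; the sample spacing dt is an assumed test parameter, not something specified in the text.

    import numpy as np

    def step_response_from_impulse(h_est, dt):
        # Cumulative integration of the impulse response estimate approximates
        # the step response at the same sample times.
        return np.cumsum(h_est) * dt

    # e.g., step = step_response_from_impulse(h_est, dt=1.0)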

A great deal of work has been devoted to impulse response measurements using input signals having impulse-like autocorrelation functions. Techniques have been developed (14) that permit a correction of the results to account for the deviation of the autocorrelation function from true delta-function behavior. Since the emphasis in this book is on frequency response measurements, these techniques will not be described here. Also, the reader should note that there is no need for such corrections in the frequency domain.