Signals and Signal Spaces
This can be seen from

$$ d^2(x, x_\tau) = \|x - x_\tau\|^2 = (x, x) - (x, x_\tau) - (x_\tau, x) + (x_\tau, x_\tau) $$
$$ \phantom{d^2(x, x_\tau)} = 2\,\|x\|^2 - 2\,\Re\{(x_\tau, x)\} \qquad (1.38) $$
$$ \phantom{d^2(x, x_\tau)} = 2\,\|x\|^2 - 2\,\Re\{r^E_{xx}(\tau)\}. $$

With increasing correlation the distance decreases.²

²In this section, we freely use the properties of the Fourier transform. For more detail on the Fourier transform and Parseval's theorem, see Section 2.2.

Similarly, the cross correlation function

$$ r^E_{yx}(\tau) = \int_{-\infty}^{\infty} y(t+\tau)\, x^*(t)\, dt \qquad (1.39) $$

and the corresponding cross energy density spectrum

$$ S^E_{yx}(\omega) = \int_{-\infty}^{\infty} r^E_{yx}(\tau)\, e^{-j\omega\tau}\, d\tau \qquad (1.40) $$
$$ \phantom{S^E_{yx}(\omega)} = Y(\omega)\, X^*(\omega) \qquad (1.41) $$

are introduced, where $r^E_{yx}(\tau)$ may be viewed as a measure of the similarity between the two signals $x(t)$ and $y_\tau(t) = y(t+\tau)$.
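The relation (1.38) is easy to check numerically. The following sketch, assuming a sampled Gaussian pulse and Riemann-sum approximations of the integrals (both arbitrary choices for illustration), compares the squared distance with $2\|x\|^2 - 2\,\Re\{r^E_{xx}(\tau)\}$:

```python
import numpy as np

# Numerical check of (1.38) for a shifted copy x_tau(t) = x(t + tau).
# The Gaussian pulse and the Riemann-sum integration are illustrative.
dt = 1e-3
t = np.arange(-10.0, 10.0, dt)
x = np.exp(-t**2)                       # test signal x(t)
tau = 0.5
x_tau = np.exp(-(t + tau)**2)           # x(t + tau)

d2 = np.sum(np.abs(x - x_tau) ** 2) * dt       # squared distance d^2(x, x_tau)
r_tau = np.sum(np.conj(x) * x_tau) * dt        # r_xx^E(tau)
energy = np.sum(np.abs(x) ** 2) * dt           # ||x||^2

print(d2, 2 * energy - 2 * r_tau.real)         # both values agree
```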
1.2.2 Discrete-Time Signals

All previous considerations are applicable to discrete-time signals $x(n)$ as well. The signals $x(n)$ may be real or complex-valued. As in the continuous-time case, we start the discussion with the energy of the signal:

$$ E_x = \sum_{n=-\infty}^{\infty} |x(n)|^2. \qquad (1.42) $$

According to Parseval's relation for the discrete-time Fourier transform, we may alternatively compute $E_x$ from $X(e^{j\omega})$:³

$$ E_x = \frac{1}{2\pi} \int_{-\pi}^{\pi} |X(e^{j\omega})|^2\, d\omega. \qquad (1.43) $$

The term $|X(e^{j\omega})|^2$ in (1.43) is called the energy density spectrum of the discrete-time signal. We use the notation

$$ S^E_{xx}(e^{j\omega}) = |X(e^{j\omega})|^2. \qquad (1.44) $$

The energy density spectrum $S^E_{xx}(e^{j\omega})$ is the discrete-time Fourier transform of the autocorrelation sequence

$$ r^E_{xx}(m) = \sum_{n=-\infty}^{\infty} x^*(n)\, x(n+m). \qquad (1.45) $$

³See Section 4.2 for more detail on the discrete-time Fourier transform.
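For a finite-length sequence, (1.42) and (1.43) can be compared directly via the DFT. In the sketch below the random test signal and all names are illustrative; with the unnormalized length-$N$ DFT, the integral in (1.43) becomes $(1/N)$ times the sum over the bins:

```python
import numpy as np

# Minimal check of Parseval's relation (1.42)/(1.43) for a finite sequence.
rng = np.random.default_rng(0)
x = rng.standard_normal(256) + 1j * rng.standard_normal(256)

# Energy in the time domain, E_x = sum |x(n)|^2
E_time = np.sum(np.abs(x) ** 2)

# Energy from the energy density spectrum |X(e^jw)|^2; for a length-N DFT
# the factor (1/2pi) * integral turns into (1/N) * sum over the bins.
X = np.fft.fft(x)
E_freq = np.sum(np.abs(X) ** 2) / len(x)

print(E_time, E_freq)   # the two values agree up to rounding error
```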
We have

$$ S^E_{xx}(e^{j\omega}) = \sum_{m=-\infty}^{\infty} r^E_{xx}(m)\, e^{-j\omega m}, \qquad (1.46) $$
$$ r^E_{xx}(m) = \frac{1}{2\pi} \int_{-\pi}^{\pi} S^E_{xx}(e^{j\omega})\, e^{j\omega m}\, d\omega. $$

Note that the energy density may also be viewed as the product $X(z)\, X^*(1/z^*)$, evaluated on the unit circle ($z = e^{j\omega}$), where $X(z)$ is the z-transform of $x(n)$.

The definition of the cross correlation sequence is

$$ r^E_{yx}(m) = \sum_{n=-\infty}^{\infty} y(n+m)\, x^*(n). \qquad (1.47) $$

For the corresponding cross energy density spectrum the following holds:

$$ S^E_{yx}(e^{j\omega}) = \sum_{m=-\infty}^{\infty} r^E_{yx}(m)\, e^{-j\omega m}, \qquad (1.48) $$

that is,

$$ S^E_{yx}(e^{j\omega}) = Y(e^{j\omega})\, X^*(e^{j\omega}). \qquad (1.49) $$
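The correspondence (1.47)-(1.49) can likewise be illustrated numerically. The sketch below (the test signals are arbitrary) computes $r^E_{yx}(m)$ once by the direct sum and once by transforming $Y(e^{j\omega})\,X^*(e^{j\omega})$ back to the lag domain; zero padding makes the circular correlation computed by the FFT equal to the linear one:

```python
import numpy as np

# Check of (1.47) against the FFT route via (1.48)/(1.49).
rng = np.random.default_rng(1)
x = rng.standard_normal(64)
y = rng.standard_normal(64)

# Direct evaluation of r_yx(m) = sum_n y(n+m) x*(n) for m = 0..15
r_direct = np.array([np.sum(y[m:] * np.conj(x[: len(x) - m]))
                     for m in range(16)])

# FFT route: zero-pad so the circular correlation equals the linear one,
# then transform Y(e^jw) X*(e^jw) back to the lag domain
N = 2 * len(x)
S_yx = np.fft.fft(y, N) * np.conj(np.fft.fft(x, N))
r_fft = np.fft.ifft(S_yx)[:16].real

print(np.allclose(r_direct, r_fft))  # True
```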
1.3 Random Signals

Random signals are encountered in all areas of signal processing. For example, they appear as disturbances in the transmission of signals. Even the transmitted and consequently also the received signals in telecommunications are of a random nature, because only random signals carry information. In pattern recognition, the patterns that are to be distinguished are modeled as random processes. In speech, audio, and image coding, the signals to be compressed are modeled as such.

First of all, one distinguishes between random variables and random processes.
A random variable is obtained by assigning a real or complex number to each feature $m_i$ from a feature set $M$. The features (or events) occur randomly. Note that the features themselves may also be non-numeric. If one assigns a function ${}_ix(t)$ to each feature $m_i$, then the totality of all possible functions is called a stochastic process. The features occur randomly, whereas the assignment $m_i \to {}_ix(t)$ is deterministic.
A function i z ( t )is calledthe realization of the stochasticprocess z ( t ) .See Figure 1.1for an illustration.111.3. Random Signalst"31\(b)Figure 1.1. Random variables (a) and random processes (b).1.3.1Properties of RandomVariablesThe properties of a real random variable X are thoroughly characterized byits cumulative distribution function F,(a) and also by its probability densityfunction (pdf) p,(.).The distribution states the probability P with whichthe value of the random variable X is smaller than or equal to a given valuea:F,(a) = P ( x a ) .(1.50)Here, the axioms of probability hold, which state thatlim F,(a) = 0,a+--00lim F,(a) = 1,a+wF,(al)5 F,(a2)fora15 a2.(1.51)12Chapter 1 . Signals and Signal SpacesGiven the distribution, we obtain the pdf by differentiation:(1.52)Since the distribution is a non-decreasing function, we haveJoint Probability Density.
Joint Probability Density. The joint probability density $p_{x_1,x_2}(\xi_1, \xi_2)$ of two random variables $x_1$ and $x_2$ is given by

$$ p_{x_1,x_2}(\xi_1, \xi_2) = p_{x_1}(\xi_1)\; p_{x_2|x_1}(\xi_2\,|\,\xi_1), \qquad (1.54) $$

where $p_{x_2|x_1}(\xi_2|\xi_1)$ is a conditional probability density (the density of $x_2$ provided that $x_1$ has taken on the value $\xi_1$). If the variables $x_1$ and $x_2$ are statistically independent of one another, (1.54) reduces to

$$ p_{x_1,x_2}(\xi_1, \xi_2) = p_{x_1}(\xi_1)\; p_{x_2}(\xi_2). \qquad (1.55) $$

The pdf of a complex random variable is defined as the joint density of its real and imaginary parts:

$$ p_x(\xi) = p_{\Re\{x\},\,\Im\{x\}}(\xi_1, \xi_2), \qquad \xi = \xi_1 + j\xi_2. \qquad (1.56) $$

Moments. The properties of a random variable are often described by its moments

$$ m_x^{(n)} = E\{|x|^n\}. \qquad (1.57) $$

Herein, $E\{\cdot\}$ denotes the expected value (statistical average). An expected value $E\{g(x)\}$, where $g(x)$ is an arbitrary function of the random variable $x$, can be calculated from the density as

$$ E\{g(x)\} = \int_{-\infty}^{\infty} g(\xi)\; p_x(\xi)\; d\xi. \qquad (1.58) $$

For $g(x) = x$ we obtain the mean value (first moment):

$$ m_x = E\{x\} = \int_{-\infty}^{\infty} \xi\; p_x(\xi)\; d\xi. \qquad (1.59) $$

For $g(x) = |x|^2$ we obtain the average power (second moment):

$$ S_x^2 = E\{|x|^2\} = \int_{-\infty}^{\infty} |\xi|^2\; p_x(\xi)\; d\xi. \qquad (1.60) $$
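The following sketch evaluates (1.59) and (1.60) for a standard Gaussian (again only an example) both as density integrals and as sample averages:

```python
import numpy as np

# Moments as density integrals (1.58)-(1.60) versus sample averages.
rng = np.random.default_rng(2)
samples = rng.standard_normal(200_000)

xi = np.linspace(-8.0, 8.0, 4001)
dxi = xi[1] - xi[0]
pdf = np.exp(-xi**2 / 2) / np.sqrt(2 * np.pi)

m_x = np.sum(xi * pdf) * dxi             # mean value, (1.59)
S_x2 = np.sum(xi**2 * pdf) * dxi         # average power, (1.60)

print(m_x, samples.mean())               # both ~0
print(S_x2, np.mean(samples**2))         # both ~1
```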
The variance (second central moment) is calculated with $g(x) = |x - m_x|^2$ as

$$ \sigma_x^2 = E\{|x - m_x|^2\} = \int_{-\infty}^{\infty} |\xi - m_x|^2\; p_x(\xi)\; d\xi. \qquad (1.61) $$

The following holds:

$$ \sigma_x^2 = S_x^2 - |m_x|^2. \qquad (1.62) $$

Characteristic Function. The characteristic function of a random variable $x$ is defined as

$$ \Phi_x(\nu) = E\{e^{j\nu x}\} = \int_{-\infty}^{\infty} e^{j\nu\xi}\; p_x(\xi)\; d\xi, \qquad (1.63) $$

which means that, apart from the sign of the argument, it is the Fourier transform of the pdf. According to the moment theorem of the Fourier transform (see Section 2.2), the moments of the random variable can also be computed from the characteristic function as

$$ m_x^{(n)} = E\{x^n\} = \frac{1}{j^n}\, \frac{d^n \Phi_x(\nu)}{d\nu^n}\bigg|_{\nu=0}. \qquad (1.64) $$
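The moment theorem (1.64) can be illustrated by estimating the characteristic function from samples and differentiating it numerically at $\nu = 0$. The Gaussian test distribution with mean 1 and the step size h below are arbitrary choices:

```python
import numpy as np

# Moments recovered from numerical derivatives of an empirical
# characteristic function, cf. (1.63)/(1.64).
rng = np.random.default_rng(3)
x = 1.0 + rng.standard_normal(500_000)

def char_fn(nu):
    # Phi_x(nu) = E{exp(j nu x)}, estimated by a sample average
    return np.mean(np.exp(1j * nu * x))

h = 1e-3
m1 = (char_fn(h) - char_fn(-h)) / (2j * h)                 # E{x}
m2 = -(char_fn(h) - 2 * char_fn(0) + char_fn(-h)) / h**2   # E{x^2}

print(m1.real, m2.real)   # ~1.0 and ~2.0 (mean 1, power 1 + 1^2)
```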
1.3.2 Random Processes

The starting point for the following considerations is a stochastic process $x(t)$, from which the random variables $x_{t_1}, x_{t_2}, \ldots, x_{t_n}$ with $x_{t_k} = x(t_k)$ are taken at times $t_1 < t_2 < \ldots < t_n$, $n \in \mathbb{N}$. The properties of these random variables are characterized by their joint pdf $p_{x_{t_1}, x_{t_2}, \ldots, x_{t_n}}(\alpha_1, \alpha_2, \ldots, \alpha_n)$. Then a second set of random variables is taken from the process $x(t)$, applying a time shift $\tau$: $x_{t_1+\tau}, x_{t_2+\tau}, \ldots, x_{t_n+\tau}$ with $x_{t_k+\tau} = x(t_k + \tau)$. If the joint densities of both sets are equal for all time shifts $\tau$ and all $n$, that is, if

$$ p_{x_{t_1},\ldots,x_{t_n}}(\alpha_1,\ldots,\alpha_n) = p_{x_{t_1+\tau},\ldots,x_{t_n+\tau}}(\alpha_1,\ldots,\alpha_n), \qquad (1.65) $$

then we speak of a strictly stationary process; otherwise we call the process non-stationary.
Autocorrelation and Autocovariance Functions of Non-Stationary Processes. The autocorrelation function of a general random process is defined as a second-order moment:

$$ r_{xx}(t_1, t_2) = E\{x_1 x_2\} = E\{x(t_1)\, x^*(t_2)\}, \qquad (1.66) $$

where $x_1 = x(t_1)$ and $x_2 = x^*(t_2)$.

Basically, the autocorrelation function indicates how similar the process is at times $t_1$ and $t_2$, since for the expected Euclidean distance we have

$$ E\{d^2(x_{t_1}, x_{t_2})\} = r_{xx}(t_1, t_1) + r_{xx}(t_2, t_2) - 2\,\Re\{r_{xx}(t_1, t_2)\}. $$

The autocovariance function of a random process is defined as

$$ c_{xx}(t_1, t_2) = E\{[x(t_1) - m_{t_1}]\,[x(t_2) - m_{t_2}]^*\} = r_{xx}(t_1, t_2) - m_{t_1}\, m_{t_2}^*, \qquad (1.67) $$

where $m_{t_k}$ denotes the expected value at time $t_k$; i.e.,

$$ m_{t_k} = E\{x(t_k)\}. \qquad (1.68) $$
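Since (1.66) is an ensemble average, it can be estimated from many realizations. The sketch below constructs an artificial non-stationary process $x(t) = a\,t$ with random slope $a$ (chosen so that the exact acf $r_{xx}(t_1, t_2) = t_1 t_2$ is known) and compares the ensemble estimate with the exact value:

```python
import numpy as np

# Ensemble estimate of the acf (1.66) of a non-stationary process.
rng = np.random.default_rng(4)
n_real, n_time = 100_000, 8
a = rng.standard_normal((n_real, 1))     # random slope, a ~ N(0,1)
t = np.arange(n_time)
X = a * t                                # each row is one realization x(t) = a*t

# r_xx(t1, t2) = E{x(t1) x*(t2)}, averaged over the ensemble
R = (X.T @ X.conj()) / n_real

print(R[2, 3], t[2] * t[3])              # ensemble estimate vs. exact value 6
```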
Wide-Sense Stationary Processes. There are processes whose mean value is constant and whose autocorrelation function is a function of $t_1 - t_2$ only. Such processes are referred to as "wide-sense stationary", even if they are non-stationary according to the above definition.

Cyclo-Stationary Processes. If a process is non-stationary according to the definition stated above, but its properties repeat periodically, then we speak of a cyclo-stationary process.

Autocorrelation and Autocovariance Functions of Wide-Sense Stationary Processes. In the following we assume wide-sense stationarity, so that the first and second moments are independent of the respective time. Because of the stationarity we must assume that the process realizations are not absolutely integrable, and that their Fourier transforms do not exist. Since in the field of telecommunications one also encounters complex-valued processes when describing real bandpass processes in the complex baseband, we shall continue by looking at complex-valued processes.
For wide-sense stationary processes the autocorrelation function (acf) depends only on the time shift between the respective times; it is given by

$$ r_{xx}(\tau) = E\{x^*(t)\, x(t+\tau)\}. \qquad (1.69) $$

For $x_1 = x(t+\tau)$ and $x_2 = x^*(t)$, the expected value $E\{x_1 x_2\}$ can be written as

$$ r_{xx}(\tau) = E\{x_1 x_2\} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \xi_1\, \xi_2\; p_{x_1,x_2}(\xi_1, \xi_2)\; d\xi_1\, d\xi_2. \qquad (1.70) $$

The maximum of the autocorrelation function is located at $\tau = 0$, where its value equals the mean square value:

$$ r_{xx}(0) = E\{|x(t)|^2\} = S_x^2. \qquad (1.71) $$

Furthermore, we have

$$ r_{xx}(-\tau) = r_{xx}^*(\tau). \qquad (1.72) $$

When subtracting the mean prior to computing the autocorrelation function, we get the autocovariance function

$$ c_{xx}(\tau) = E\{[x^*(t) - m_x^*]\,[x(t+\tau) - m_x]\} = r_{xx}(\tau) - |m_x|^2. \qquad (1.73) $$

Power Spectral Density. The power spectral density, or power density spectrum, describes the distribution of power with respect to frequency. It is defined as the Fourier transform of the autocorrelation function:

$$ S_{xx}(\omega) = \int_{-\infty}^{\infty} r_{xx}(\tau)\, e^{-j\omega\tau}\, d\tau, \qquad (1.74) $$
$$ r_{xx}(\tau) = \frac{1}{2\pi} \int_{-\infty}^{\infty} S_{xx}(\omega)\, e^{j\omega\tau}\, d\omega. \qquad (1.75) $$

This definition is based on the Wiener-Khintchine theorem, which states that the physically meaningful power spectral density, given by

$$ S_{xx}(\omega) = \lim_{T \to \infty} \frac{1}{T}\, E\left\{ |X_T(\omega)|^2 \right\} \qquad (1.76) $$

with

$$ X_T(\omega) \;\longleftrightarrow\; x(t)\; \mathrm{rect}\!\left(\frac{t}{T}\right), \qquad \mathrm{rect}(t) = \begin{cases} 1, & |t| \le 0.5, \\ 0, & \text{otherwise}, \end{cases} $$

is identical to the power spectral density given in (1.74). Taking (1.75) for $\tau = 0$, we obtain

$$ S_x^2 = r_{xx}(0) = \frac{1}{2\pi} \int_{-\infty}^{\infty} S_{xx}(\omega)\, d\omega. \qquad (1.77) $$
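In discrete time, the Wiener-Khintchine relation can be illustrated by comparing the averaged periodogram, the discrete-time analog of (1.76), with the transform of the acf as in (1.74). The MA(1) test process below, with acf $r(0) = 1.25$ and $r(\pm 1) = 0.5$, is an arbitrary example:

```python
import numpy as np

# Averaged periodogram E{|X_T|^2}/T versus the transform of the acf.
rng = np.random.default_rng(5)
n_real, N = 2000, 256
w = rng.standard_normal((n_real, N + 1))
x = w[:, 1:] + 0.5 * w[:, :-1]            # x(n) = w(n) + 0.5 w(n-1)

# Averaged periodogram over the ensemble of realizations, cf. (1.76)
S_per = np.mean(np.abs(np.fft.fft(x, axis=1)) ** 2, axis=0) / N

# DTFT of the acf r(0) = 1.25, r(+-1) = 0.5, sampled at the DFT bins
S_acf = 1.25 + 2 * 0.5 * np.cos(2 * np.pi * np.arange(N) / N)

print(np.max(np.abs(S_per - S_acf)))      # small (estimation error only)
```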
Cross Correlation and Cross Power Spectral Density. The cross correlation between two wide-sense stationary random processes $x(t)$ and $y(t)$ is defined as

$$ r_{xy}(\tau) = E\{x^*(t)\, y(t+\tau)\}. \qquad (1.78) $$

The Fourier transform of $r_{xy}(\tau)$ is the cross power spectral density, denoted as $S_{xy}(\omega)$. Thus, we have the correspondence

$$ r_{xy}(\tau) \;\longleftrightarrow\; S_{xy}(\omega) = \int_{-\infty}^{\infty} r_{xy}(\tau)\, e^{-j\omega\tau}\, d\tau. \qquad (1.79) $$

Discrete-Time Signals. The following definitions for discrete-time signals basically correspond to those for continuous-time signals; the correlation and covariance functions, however, become correlation and covariance sequences. For the autocorrelation sequence we have

$$ r_{xx}(m) = E\{x^*(n)\, x(n+m)\}. \qquad (1.80) $$

The autocovariance sequence is defined as

$$ c_{xx}(m) = E\{[x^*(n) - m_x^*]\,[x(n+m) - m_x]\} \qquad (1.81) $$
$$ \phantom{c_{xx}(m)} = r_{xx}(m) - |m_x|^2. \qquad (1.82) $$
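A sketch of estimating the autocorrelation sequence (1.80) from a single realization by time averaging follows (this anticipates the ergodicity assumption discussed at the end of this section); the AR(1) test process is an arbitrary example with known acf:

```python
import numpy as np

# Time-average estimate of r_xx(m) = E{x*(n) x(n+m)} from one realization.
rng = np.random.default_rng(6)
N = 100_000
w = rng.standard_normal(N)
x = np.zeros(N)
for n in range(1, N):
    x[n] = 0.9 * x[n - 1] + w[n]         # AR(1) process

def acf_est(x, m):
    # r_xx(m) ~ average of x*(n) x(n+m) over all available n
    return np.mean(np.conj(x[: len(x) - m]) * x[m:])

for m in range(4):
    # theory for this AR(1) process: r_xx(m) = 0.9**m / (1 - 0.9**2)
    print(m, acf_est(x, m), 0.9**m / (1 - 0.81))
```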
The discrete-time Fourier transform of the autocorrelation sequence is the power spectral density (Wiener-Khintchine theorem). We have

$$ S_{xx}(e^{j\omega}) = \sum_{m=-\infty}^{\infty} r_{xx}(m)\, e^{-j\omega m}, \qquad (1.83) $$
$$ r_{xx}(m) = \frac{1}{2\pi} \int_{-\pi}^{\pi} S_{xx}(e^{j\omega})\, e^{j\omega m}\, d\omega. \qquad (1.84) $$

The definition of the cross correlation sequence is

$$ r_{xy}(m) = E\{x^*(n)\, y(n+m)\}. \qquad (1.85) $$

A cross covariance sequence can be defined as

$$ c_{xy}(m) = E\{[x^*(n) - m_x^*]\,[y(n+m) - m_y]\} = r_{xy}(m) - m_x^*\, m_y. \qquad (1.86) $$

For the corresponding cross power spectral density we have

$$ S_{xy}(e^{j\omega}) = \sum_{m=-\infty}^{\infty} r_{xy}(m)\, e^{-j\omega m}, \qquad (1.87) $$
$$ r_{xy}(m) = \frac{1}{2\pi} \int_{-\pi}^{\pi} S_{xy}(e^{j\omega})\, e^{j\omega m}\, d\omega. \qquad (1.88) $$

Correlation Matrices. Auto and cross correlation matrices are frequently required. We use the following definitions:

$$ \boldsymbol{R}_{xx} = E\{\boldsymbol{x}\boldsymbol{x}^H\}, \qquad \boldsymbol{R}_{xy} = E\{\boldsymbol{y}\boldsymbol{x}^H\}, \qquad (1.89) $$

where

$$ \boldsymbol{x} = [x(n),\, x(n+1),\, \ldots,\, x(n+N_x-1)]^T, \qquad (1.90) $$
$$ \boldsymbol{y} = [y(n),\, y(n+1),\, \ldots,\, y(n+N_y-1)]^T. $$

The terms $\boldsymbol{x}\boldsymbol{x}^H$ and $\boldsymbol{y}\boldsymbol{x}^H$ are dyadic products. For the sake of completeness it shall be noted that the autocorrelation matrix $\boldsymbol{R}_{xx}$ of a stationary process $x(n)$ has the following Toeplitz structure:

$$ \boldsymbol{R}_{xx} = \begin{bmatrix} r_{xx}(0) & r_{xx}^*(1) & \cdots & r_{xx}^*(N_x-1) \\ r_{xx}(1) & r_{xx}(0) & \cdots & r_{xx}^*(N_x-2) \\ \vdots & \vdots & \ddots & \vdots \\ r_{xx}(N_x-1) & r_{xx}(N_x-2) & \cdots & r_{xx}(0) \end{bmatrix}. \qquad (1.91) $$

Here, the property

$$ r_{xx}(-i) = r_{xx}^*(i), \qquad (1.92) $$

which is concluded from (1.80) by taking stationarity into consideration, has been used.
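The structure (1.91) can be checked numerically: the sketch below estimates $\boldsymbol{R}_{xx}$ once as an averaged outer product $E\{\boldsymbol{x}\boldsymbol{x}^H\}$ and once by filling a Toeplitz matrix with acf values, using (1.92) for the negative lags. The MA(1)-type test process is an arbitrary choice:

```python
import numpy as np

# Autocorrelation matrix: averaged dyadic products versus Toeplitz fill.
rng = np.random.default_rng(7)
n_real, Nx = 50_000, 4
W = rng.standard_normal((n_real, Nx + 2))
X = W[:, 2:] + W[:, 1:-1]                 # x(n) = w(n) + w(n-1), per row

# Ensemble average of the dyadic products x x^H, cf. (1.89)
R_outer = (X.T @ X.conj()) / n_real

# Toeplitz construction: entry (i,k) = r_xx(i-k), with r_xx(-i) = r_xx*(i)
r = np.array([np.mean(np.conj(X[:, 0]) * X[:, m]) for m in range(Nx)])
i, k = np.indices((Nx, Nx))
R_toep = np.where(i >= k, r[np.abs(i - k)], np.conj(r[np.abs(i - k)]))

print(np.allclose(R_outer, R_toep, atol=0.1))   # True up to estimation error
```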
If two processes $x(n)$ and $y(n)$ are pairwise stationary, we have

$$ r_{xy}(-i) = r_{yx}^*(i), \qquad (1.93) $$

and the cross correlation matrix $\boldsymbol{R}_{xy} = E\{\boldsymbol{y}\boldsymbol{x}^H\}$ has the following structure:

$$ \boldsymbol{R}_{xy} = \begin{bmatrix} r_{xy}(0) & r_{yx}^*(1) & \cdots & r_{yx}^*(N_x-1) \\ r_{xy}(1) & r_{xy}(0) & \cdots & r_{yx}^*(N_x-2) \\ \vdots & \vdots & \ddots & \vdots \\ r_{xy}(N_y-1) & r_{xy}(N_y-2) & \cdots & r_{xy}(N_y-N_x) \end{bmatrix}. \qquad (1.94) $$

Auto- and cross-covariance matrices can be defined in an analogous way by replacing the entries $r_{xy}(m)$ with $c_{xy}(m)$.

Ergodic Processes. Usually, the autocorrelation function is calculated according to (1.70) by taking the ensemble average.
An exception to thisrule is the ergodic process, where the ensemble average can be replaced by atemporal average. For the autocorrelation function of an ergodic continuoustime process we have(1.95)where iz(t)is an arbitrary realization of the stochastic process. Accordingly,we get(1.96)for discrete-time signals.Continuous-Time White Noise Process.