
14. Stochastic Processes (PILLAI/Cha)

Let ξ denote the random outcome of an experiment. To every such outcome suppose a waveform X(t, ξ) is assigned. The collection of such waveforms forms a stochastic process. The set {ξ_k} and the time index t can be continuous or discrete (countably infinite or finite) as well. For fixed ξ_i ∈ S (the set of all experimental outcomes), X(t, ξ_i) is a specific time function. For fixed t = t1, X1 = X(t1, ξ_i) is a random variable. The ensemble of all such realizations X(t, ξ) over time represents the stochastic process X(t); see Fig. 14.1.

[Fig. 14.1: a family of realizations X(t, ξ1), X(t, ξ2), ..., X(t, ξn) versus t, with two sampling instants t1 and t2 marked.]

For example, X(t) = a cos(ω0 t + φ), where φ is a uniformly distributed random variable in (0, 2π), represents a stochastic process. Stochastic processes are everywhere: Brownian motion, stock-market fluctuations, and various queuing systems all represent stochastic phenomena.

If X(t) is a stochastic process, then for fixed t, X(t) represents a random variable. Its distribution function is given by

  F_X(x, t) = P{X(t) <= x}.  (14-1)

Notice that F_X(x, t) depends on t, since for a different t we obtain a different random variable. Further,

  f_X(x, t) = ∂F_X(x, t)/∂x  (14-2)

represents the first-order probability density function of the process X(t).

For t = t1 and t = t2, X(t) represents two different random variables X1 = X(t1) and X2 = X(t2) respectively. Their joint distribution is given by

  F_X(x1, x2, t1, t2) = P{X(t1) <= x1, X(t2) <= x2},  (14-3)

and

  f_X(x1, x2, t1, t2) = ∂²F_X(x1, x2, t1, t2)/(∂x1 ∂x2)  (14-4)

represents the second-order density function of the process X(t). Similarly f_X(x1, x2, ..., xn, t1, t2, ..., tn) represents the nth-order density function of the process X(t). Complete specification of the stochastic process X(t) requires the knowledge of f_X(x1, ..., xn, t1, ..., tn) for all t_i, i = 1, 2, ..., n, and for all n (an almost impossible task in reality).

Mean of a Stochastic Process:

  μ_X(t) = E{X(t)} = ∫ x f_X(x, t) dx  (14-5)

represents the mean value of the process X(t). In general, the mean of a process can depend on the time index t. The autocorrelation function of a process X(t) is defined as

  R_XX(t1, t2) = E{X(t1) X*(t2)} = ∫∫ x1 x2* f_X(x1, x2, t1, t2) dx1 dx2,  (14-6)

and it represents the interrelationship between the random variables X1 = X(t1) and X2 = X(t2) generated from the process X(t).

Properties:
1. R_XX(t1, t2) = R*_XX(t2, t1) = [E{X(t2) X*(t1)}]*.  (14-7)
2. R_XX(t, t) = E{|X(t)|²} >= 0 (average instantaneous power).
3. R_XX(t1, t2) represents a nonnegative definite function, i.e., for any set of constants {a_i}, i = 1, 2, ..., n,

  Σ_{i=1}^{n} Σ_{j=1}^{n} a_i a_j* R_XX(t_i, t_j) >= 0.  (14-8)

Eq. (14-8) follows by noticing that E{|Y|²} >= 0 for Y = Σ_{i=1}^{n} a_i X(t_i).

The function

  C_XX(t1, t2) = R_XX(t1, t2) - μ_X(t1) μ*_X(t2)  (14-9)

represents the autocovariance function of the process X(t).

Example 14.1: Let z = ∫_{-T}^{T} X(t) dt. Then

  E{|z|²} = ∫_{-T}^{T} ∫_{-T}^{T} E{X(t1) X*(t2)} dt1 dt2 = ∫_{-T}^{T} ∫_{-T}^{T} R_XX(t1, t2) dt1 dt2.  (14-10)

Example 14.2:

  X(t) = a cos(ω0 t + φ),  φ ~ U(0, 2π).  (14-11)

This gives

  μ_X(t) = E{X(t)} = a E{cos(ω0 t + φ)} = a cos ω0 t E{cos φ} - a sin ω0 t E{sin φ} = 0,  (14-12)

since E{cos φ} = (1/2π) ∫_0^{2π} cos φ dφ = 0 = E{sin φ}. Similarly,

  R_XX(t1, t2) = a² E{cos(ω0 t1 + φ) cos(ω0 t2 + φ)}
             = (a²/2) E{cos ω0(t1 - t2) + cos(ω0(t1 + t2) + 2φ)}
             = (a²/2) cos ω0(t1 - t2).  (14-13)

Stationary Stochastic Processes

Stationary processes exhibit statistical properties that are invariant to shifts in the time index.
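The two moments (14-12)-(14-13) of Example 14.2 can be spot-checked with a short Monte Carlo sketch (assuming Python with numpy; the values a = 2, ω0 = 3 and the instants t1, t2 are arbitrary illustrations, not values from the text):

```python
import numpy as np

# Monte Carlo sketch of Example 14.2; a = 2, omega0 = 3, t1, t2 are
# illustrative choices only.
rng = np.random.default_rng(0)
a, omega0 = 2.0, 3.0
phi = rng.uniform(0.0, 2.0 * np.pi, size=200_000)  # phi ~ U(0, 2*pi)

t1, t2 = 0.7, 0.2
x1 = a * np.cos(omega0 * t1 + phi)   # ensemble of X(t1) values
x2 = a * np.cos(omega0 * t2 + phi)   # ensemble of X(t2) values

mean_est = x1.mean()                                 # (14-12) predicts 0
r_est = np.mean(x1 * x2)                             # ensemble estimate of R_XX(t1, t2)
r_theory = (a**2 / 2) * np.cos(omega0 * (t1 - t2))   # (14-13)
```

With 200,000 draws of φ the sample mean sits near zero and the ensemble average of X(t1)X(t2) tracks (a²/2) cos ω0(t1 - t2) to within a few hundredths.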

For example, second-order stationarity implies that the statistical properties of the pairs {X(t1), X(t2)} and {X(t1 + c), X(t2 + c)} are the same for any c. Similarly, first-order stationarity implies that the statistical properties of X(t_i) and X(t_i + c) are the same for any c.

In strict terms, the statistical properties are governed by the joint probability density function. Hence a process is nth-order Strict-Sense Stationary (S.S.S.) if

  f_X(x1, x2, ..., xn, t1, t2, ..., tn) = f_X(x1, x2, ..., xn, t1 + c, t2 + c, ..., tn + c)  (14-14)

for any c, where the left side represents the joint density function of the random variables X1 = X(t1), X2 = X(t2), ..., Xn = X(tn) and the right side corresponds to the joint density function of the random variables X1' = X(t1 + c), X2' = X(t2 + c), ..., Xn' = X(tn + c). A process X(t) is said to be strict-sense stationary if (14-14) is true for all t_i, i = 1, 2, ..., n, for all n, and for any c.

For a first-order strict-sense stationary process, from (14-14) we have

  f_X(x, t) = f_X(x, t + c)  (14-15)

for any c. In particular, c = -t gives

  f_X(x, t) = f_X(x),  (14-16)

i.e., the first-order density of X(t) is independent of t. In that case

  E{X(t)} = ∫ x f_X(x) dx = μ, a constant.  (14-17)

Similarly, for a second-order strict-sense stationary process we have from (14-14)

  f_X(x1, x2, t1, t2) = f_X(x1, x2, t1 + c, t2 + c)

for any c. For c = -t2 we get

  f_X(x1, x2, t1, t2) = f_X(x1, x2, t1 - t2),  (14-18)

i.e., the second-order density function of a strict-sense stationary process depends only on the difference of the time indices τ = t1 - t2. In that case the autocorrelation function is given by

  R_XX(t1, t2) = E{X(t1) X*(t2)} = ∫∫ x1 x2* f_X(x1, x2, τ = t1 - t2) dx1 dx2 = R_XX(t1 - t2) = R_XX(τ) = R*_XX(-τ),  (14-19)

i.e., the autocorrelation function of a second-order strict-sense stationary process depends only on the difference of the time indices t1 - t2.

Notice that (14-17) and (14-19) are consequences of the stochastic process being first- and second-order strict-sense stationary. On the other hand, the basic conditions for first- and second-order stationarity, Eqs. (14-16) and (14-18), are usually difficult to verify. In that case, we often resort to a looser definition of stationarity, known as Wide-Sense Stationarity (W.S.S.), by making use of (14-17) and (14-19) as the necessary conditions. Thus, a process X(t) is said to be Wide-Sense Stationary if

  (i)  E{X(t)} = μ, a constant,  (14-20)
  (ii) E{X(t1) X*(t2)} = R_XX(t1 - t2),  (14-21)

i.e., for wide-sense stationary processes, the mean is a constant and the autocorrelation function depends only on the difference between the time indices. Notice that (14-20)-(14-21) do not say anything about the nature of the probability density functions; instead they deal only with the average behavior of the process. Since (14-20)-(14-21) follow from (14-16) and (14-18), strict-sense stationarity always implies wide-sense stationarity. However, the converse is not true in general, the only exception being the Gaussian process.

This follows since, if X(t) is a Gaussian process, then by definition X1 = X(t1), X2 = X(t2), ..., Xn = X(tn) are jointly Gaussian random variables for any t1, t2, ..., tn, whose joint characteristic function is given by

  Φ_X(ω1, ω2, ..., ωn) = exp( j Σ_k μ_X(t_k) ω_k - (1/2) Σ_{l,k} C_XX(t_l, t_k) ω_l ω_k ),  (14-22)

where C_XX(t_l, t_k) is as defined in (14-9). If X(t) is wide-sense stationary, then using (14-20)-(14-21) in (14-22) we get

  Φ_X(ω1, ω2, ..., ωn) = exp( j μ Σ_k ω_k - (1/2) Σ_{l,k} C_XX(t_l - t_k) ω_l ω_k ),  (14-23)

and hence if the set of time indices is shifted by a constant c to generate a new set of jointly Gaussian random variables X1' = X(t1 + c), X2' = X(t2 + c), ..., Xn' = X(tn + c), then their joint characteristic function is identical to (14-23). Thus the sets of random variables {X_i} and {X_i'} have the same joint probability distribution for all n and all c, establishing the strict-sense stationarity of Gaussian processes from their wide-sense stationarity.
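The key step in this argument, that the covariances C_XX(t_l - t_k) entering the Gaussian characteristic function are unchanged by a common shift c, can be sketched numerically (assuming Python with numpy; the covariance model C_XX(τ) = exp(-|τ|) and the shift c = 7.3 are hypothetical choices):

```python
import numpy as np

# Shift-invariance sketch: with a constant mean and a covariance depending
# only on t_l - t_k, the covariance matrix of X(t_1+c), ..., X(t_n+c)
# equals that of X(t_1), ..., X(t_n) for any c.
# The model C_XX(tau) = exp(-|tau|) and c = 7.3 are hypothetical.
def cov_matrix(times):
    t = np.asarray(times, dtype=float)
    return np.exp(-np.abs(t[:, None] - t[None, :]))  # entries C_XX(t_l - t_k)

t = np.array([0.0, 0.4, 1.1, 2.5])
c = 7.3
C0 = cov_matrix(t)        # covariance of the original samples
Cc = cov_matrix(t + c)    # covariance of the shifted samples
shift_invariant = bool(np.allclose(C0, Cc))
```

Since a zero-mean Gaussian joint density is fully determined by this matrix, equal matrices mean equal joint distributions, which is exactly the w.s.s.-implies-s.s.s. argument above.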

To summarize: if X(t) is a Gaussian process, then

  wide-sense stationarity (w.s.s.)  ⇒  strict-sense stationarity (s.s.s.).

Notice that since the joint p.d.f. of Gaussian random variables depends only on their second-order statistics, which is also the basis for wide-sense stationarity, we obtain strict-sense stationarity as well.

From (14-12)-(14-13) (refer to Example 14.2), the process X(t) = a cos(ω0 t + φ) in (14-11) is wide-sense stationary, but it is not strict-sense stationary.

Similarly, if X(t) is a zero-mean wide-sense stationary process, then E{|z|²} in (14-10) of Example 14.1 reduces to

  E{|z|²} = ∫_{-T}^{T} ∫_{-T}^{T} R_XX(t1 - t2) dt1 dt2.

As t1, t2 vary from -T to +T, τ = t1 - t2 varies from -2T to +2T. Moreover, R_XX(τ) is constant over the shaded strip in Fig. 14.2, whose area is given by (1/2)(2T - |τ|)² - (1/2)(2T - |τ| - dτ)² ≈ (2T - |τ|) dτ, and hence the above integral reduces to

  E{|z|²} = ∫_{-2T}^{2T} R_XX(τ)(2T - |τ|) dτ = 2T ∫_{-2T}^{2T} R_XX(τ)(1 - |τ|/2T) dτ.  (14-24)

[Fig. 14.2: the square -T <= t1, t2 <= T in the (t1, t2) plane, with the diagonal strip t1 - t2 = τ of width dτ shaded.]

Systems with Stochastic Inputs

A deterministic system(1) transforms each input waveform X(t, ξ_i) into an output waveform Y(t, ξ_i) = T[X(t, ξ_i)] by operating only on the time variable t. Thus a set of realizations at the input corresponding to a process X(t) generates a new set of realizations {Y(t, ξ)} at the output associated with a new process Y(t). Our goal is to study the output process statistics in terms of the input process statistics and the system function.

[Fig. 14.3: X(t, ξ_i) → T[·] → Y(t, ξ_i).]

(1) A stochastic system, on the other hand, operates on both the variables t and ξ.

Deterministic systems fall into two groups: memoryless systems, where

  Y(t) = g[X(t)],

and systems with memory, which include time-varying and time-invariant systems, linear systems, and in particular linear time-invariant (LTI) systems, where

  Y(t) = ∫ h(t - τ) X(τ) dτ = ∫ h(τ) X(t - τ) dτ.

Memoryless Systems: the output Y(t) in this case depends only on the present value of the input X(t), i.e.,

  Y(t) = g{X(t)}.  (14-25)

Summary for a memoryless system (Fig. 14.4):
- strict-sense stationary input → strict-sense stationary output;
- wide-sense stationary input → output need not be stationary in any sense;
- X(t) stationary Gaussian with R_XX(τ) → Y(t) stationary, but not Gaussian, with R_XY(τ) = η R_XX(τ) (see (14-26)).
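The last entry of the table, R_XY(τ) = η R_XX(τ) with η = E{g'(X)} as stated in (14-26), can be spot-checked by Monte Carlo (assuming Python with numpy; the cubic device g(x) = x³ and the correlation ρ = 0.6 are hypothetical choices):

```python
import numpy as np

# Spot check of R_XY = eta * R_XX (eq. 14-26) for a memoryless device.
# For unit-variance jointly Gaussian X1, X2 with E{X1 X2} = rho and
# g(x) = x**3, eta = E{3 X**2} = 3, so E{X1 g(X2)} should be near 3*rho.
rng = np.random.default_rng(2)
rho, n = 0.6, 400_000
z1 = rng.standard_normal(n)
z2 = rng.standard_normal(n)
x1 = z1                                     # samples of X(t1)
x2 = rho * z1 + np.sqrt(1 - rho**2) * z2    # samples of X(t2), correlation rho

r_xy_est = np.mean(x1 * x2**3)   # estimate of E{X(t1) g(X(t2))}
r_xy_theory = 3.0 * rho          # eta * R_XX(tau)
```

The sample cross-moment lands close to 3ρ, consistent with the theorem proved next.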

(See (9-76), Text, for a proof of the strict-sense stationary entry in Fig. 14.4.)

Theorem: If X(t) is a zero-mean stationary Gaussian process and Y(t) = g[X(t)], where g(·) represents a nonlinear memoryless device, then

  R_XY(τ) = η R_XX(τ),  η = E{g'(X)}.  (14-26)

Proof:

  R_XY(τ) = E{X(t1) g(X(t2))} = ∫∫ x1 g(x2) f_{X1 X2}(x1, x2) dx1 dx2,  (14-27)

where X1 = X(t1), X2 = X(t2) are jointly Gaussian random variables with joint density

  f_{X1 X2}(x1, x2) = (1 / 2π|A|^(1/2)) exp(-x* A^(-1) x / 2),  x = (x1, x2)^T,

  A = E{X X*} = [ R_XX(0)  R_XX(τ) ; R_XX(τ)  R_XX(0) ] = L L*,

where L is an upper-triangular factor matrix with positive diagonal entries, i.e.,

  L = [ l11  l12 ; 0  l22 ].

Consider the transformation Z = L^(-1) X, z = L^(-1) x, so that

  E{Z Z*} = L^(-1) A (L*)^(-1) = L^(-1) L L* (L*)^(-1) = I,

and hence Z1, Z2 are zero-mean independent Gaussian random variables. Also x = L z gives

  x1 = l11 z1 + l12 z2,  x2 = l22 z2,

and hence x* A^(-1) x = z* z = z1² + z2². The Jacobian of the transformation is given by

  |J| = |L^(-1)| = |A|^(-1/2).

Hence, substituting these into (14-27) (with dx1 dx2 = |A|^(1/2) dz1 dz2), we obtain

  R_XY(τ) = ∫∫ (l11 z1 + l12 z2) g(l22 z2) f(z1) f(z2) dz1 dz2
         = l11 ∫ z1 f(z1) dz1 ∫ g(l22 z2) f(z2) dz2 + l12 ∫ z2 g(l22 z2) f(z2) dz2,

where f(z) denotes the standard Gaussian density. The first term is zero since E{Z1} = 0. With the substitution u = l22 z2, this gives

  R_XY(τ) = (l12 / l22) ∫ u g(u) f_u(u) du,  f_u(u) = (1 / √(2π) l22) exp(-u² / 2 l22²),

where f_u is the density of a zero-mean Gaussian random variable with variance l22². Since u f_u(u) / l22² = -d f_u(u)/du, integration by parts yields

  R_XY(τ) = l12 l22 ∫ g(u) (u / l22²) f_u(u) du = -l12 l22 ∫ g(u) f_u'(u) du = l12 l22 ∫ g'(u) f_u(u) du.

Since A = L L* gives l22² = R_XX(0) and l12 l22 = R_XX(τ), the variable u has the same density as X, and hence

  R_XY(τ) = R_XX(τ) ∫ g'(u) f_u(u) du = η R_XX(τ),  η = E{g'(X)},

the desired result. Thus, if the input to a memoryless device is stationary Gaussian, the cross-correlation function between the input and the output is proportional to the input autocorrelation function.

Linear Systems: L[·] represents a linear system if

  L{a1 X1(t) + a2 X2(t)} = a1 L{X1(t)} + a2 L{X2(t)}.  (14-28)

Let

  Y(t) = L{X(t)}  (14-29)

represent the output of a linear system.

Time-Invariant System: L[·] represents a time-invariant system if

  Y(t) = L{X(t)}  ⇒  L{X(t - t0)} = Y(t - t0),  (14-30)

i.e., a shift in the input results in the same shift in the output also. If L[·] satisfies both (14-28) and (14-30), then it corresponds to a linear time-invariant (LTI) system. LTI systems can be uniquely represented in terms of their output to a delta function: δ(t) → LTI → h(t), where h(t) is the impulse response of the system (Fig. 14.5). Then, for an arbitrary input X(t) (Fig. 14.6),

  Y(t) = ∫ h(t - τ) X(τ) dτ = ∫ h(τ) X(t - τ) dτ.  (14-31)

Eq. (14-31) follows by expressing X(t) as

  X(t) = ∫ X(τ) δ(t - τ) dτ  (14-32)

and applying (14-28) and (14-30) to Y(t) = L{X(t)}. Thus

  Y(t) = L{ ∫ X(τ) δ(t - τ) dτ }
       = ∫ X(τ) L{δ(t - τ)} dτ          (by linearity)
       = ∫ X(τ) h(t - τ) dτ = ∫ h(τ) X(t - τ) dτ.  (14-33)   (by time-invariance)

Output Statistics: Using (14-33), the mean of the output process is given by

  μ_Y(t) = E{Y(t)} = ∫ E{X(t - τ)} h(τ) dτ = ∫ μ_X(t - τ) h(τ) dτ = μ_X(t) * h(t).  (14-34)

Similarly, the cross-correlation function between the input and output processes is given by

  R_XY(t1, t2) = E{X(t1) Y*(t2)}
             = E{ X(t1) ∫ X*(t2 - β) h*(β) dβ }
             = ∫ R_XX(t1, t2 - β) h*(β) dβ = R_XX(t1, t2) * h*(t2).  (14-35)

Finally, the output autocorrelation function is given by

  R_YY(t1, t2) = E{Y(t1) Y*(t2)}
             = E{ ∫ X(t1 - α) h(α) dα · Y*(t2) }
             = ∫ R_XY(t1 - α, t2) h(α) dα = R_XY(t1, t2) * h(t1),  (14-36)

or

  R_YY(t1, t2) = R_XX(t1, t2) * h*(t2) * h(t1).  (14-37)

[Fig. 14.7: (a) X(t) → h(t) → Y(t); (b) R_XX(t1, t2) → h*(t2) → R_XY(t1, t2) → h(t1) → R_YY(t1, t2).]

In particular, if X(t) is wide-sense stationary, then we have μ_X(t) = μ_X, so that from (14-34)

  μ_Y(t) = μ_X ∫ h(τ) dτ = μ_X c, a constant.  (14-38)
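A discrete-time spot check of (14-38) (assuming Python with numpy; the filter taps and the input mean μ_X = 2 are arbitrary illustrations, and a long time average over one realization stands in for the ensemble mean):

```python
import numpy as np

# Spot check of (14-38): a w.s.s. input with mean mu_x through an LTI
# filter h gives output mean mu_x * sum(h). Taps and mu_x are arbitrary.
rng = np.random.default_rng(3)
mu_x = 2.0
h = np.array([0.5, 0.3, 0.2])
x = mu_x + rng.standard_normal(200_000)   # w.s.s. input realization
y = np.convolve(x, h, mode="valid")       # output, edge transients excluded

mu_y_est = y.mean()
mu_y_theory = mu_x * h.sum()              # (14-38), with c = sum of the taps
```

The estimated output mean matches μ_X Σ h[k] = 2.0 closely, and it does not drift along the realization, in line with the output being w.s.s.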

Also, R_XX(t1, t2) = R_XX(t1 - t2),  (14-39)

so that (14-35) reduces to

  R_XY(t1, t2) = ∫ R_XX(t1 - t2 + β) h*(β) dβ = R_XX(τ) * h*(-τ) = R_XY(τ),  τ = t1 - t2.  (14-40)

Thus X(t) and Y(t) are jointly w.s.s. Further, from (14-36), the output autocorrelation simplifies to

  R_YY(t1, t2) = ∫ R_XY(t1 - t2 - α) h(α) dα = R_XY(τ) * h(τ) = R_YY(τ).

From (14-37), we obtain

  R_YY(τ) = R_XX(τ) * h*(-τ) * h(τ).  (14-41)

From (14-38)-(14-40), the output process is also wide-sense stationary. This gives rise to the following representation (Fig. 14.8):
(a) a wide-sense stationary process X(t) applied to an LTI system h(t) yields a wide-sense stationary output Y(t);
(b) a strict-sense stationary process applied to an LTI system yields a strict-sense stationary output (see Text for proof);
(c) a stationary Gaussian process applied to a linear system yields a stationary Gaussian output.

White Noise Process: W(t) is said to be a white noise process if

  R_WW(t1, t2) = q(t1) δ(t1 - t2),  (14-42)

i.e., E{W(t1) W*(t2)} = 0 unless t1 = t2. W(t) is said to be wide-sense stationary (w.s.s.) white noise if E{W(t)} = constant and

  R_WW(t1, t2) = q δ(t1 - t2) = q δ(τ).  (14-43)

If W(t) is also a Gaussian process (white Gaussian noise), then all of its samples are independent random variables (why?).

[Fig. 14.9: white noise W(t) → LTI h(t) → colored noise N(t) = h(t) * W(t).]

For w.s.s. white noise input W(t), we have

  E{N(t)} = μ_W ∫ h(τ) dτ, a constant,  (14-44)

and

  R_NN(τ) = q δ(τ) * h*(-τ) * h(τ) = q ρ(τ),  (14-45)

where

  ρ(τ) = h(τ) * h*(-τ) = ∫ h(α) h*(α + τ) dα.  (14-46)

Thus the output of a white noise process through an LTI system represents a (colored) noise process. Note: white noise need not be Gaussian. "White" and "Gaussian" are two different concepts!

Upcrossings and Downcrossings of a Stationary Gaussian Process

Consider a zero-mean stationary Gaussian process X(t) with autocorrelation function R_XX(τ). An upcrossing over the mean value occurs whenever a realization X(t) passes through zero with positive slope (Fig. 14.10). Let ρ Δt represent the probability of such an upcrossing in the interval (t, t + Δt). We wish to determine ρ.

[Fig. 14.10: a realization X(t) versus t, with an upcrossing and a downcrossing of the zero level marked.]

Since X(t) is a stationary Gaussian process, its derivative process X'(t) is also a zero-mean stationary Gaussian process, with autocorrelation function R_{X'X'}(τ) = -R''_XX(τ) (see (9-101)-(9-106), Text). Further, X(t) and X'(t) are jointly Gaussian stationary processes, and since (see (9-106), Text)

  R_{XX'}(τ) = -dR_XX(τ)/dτ,  (14-47)

which for τ = 0 gives

  R_{XX'}(0) = E{X(t) X'(t)} = -R'_XX(0) = 0  (14-48)

(R_XX(τ) is even, so its derivative vanishes at τ = 0), we have that the jointly Gaussian zero-mean random variables

  X1 = X(t),  X2 = X'(t)  (14-49)

are uncorrelated and hence independent, with variances

  σ1² = R_XX(0)  and  σ2² = R_{X'X'}(0) = -R''_XX(0) > 0  (14-50)

respectively. Thus

  f_{X1 X2}(x1, x2) = f_{X1}(x1) f_{X2}(x2) = (1 / 2π σ1 σ2) exp(-(x1²/2σ1² + x2²/2σ2²)).  (14-51)
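The uncorrelatedness of X(t) and X'(t) noted above holds for any differentiable w.s.s. process, so it can be illustrated even on the non-Gaussian process of Example 14.2 (assuming Python with numpy; a = 2, ω0 = 3 and t = 1.3 are arbitrary): averaging X(t) X'(t) over the uniform phase gives zero.

```python
import numpy as np

# For X(t) = a cos(omega0 t + phi), the derivative is
# X'(t) = -a omega0 sin(omega0 t + phi); averaging the product over
# phi ~ U(0, 2*pi) should give E{X(t) X'(t)} = 0, matching (14-48).
a, omega0, t = 2.0, 3.0, 1.3
phi = np.linspace(0.0, 2.0 * np.pi, 10_000, endpoint=False)  # uniform phase grid
x = a * np.cos(omega0 * t + phi)
xdot = -a * omega0 * np.sin(omega0 * t + phi)
corr = np.mean(x * xdot)   # discretized E{X(t) X'(t)}; should vanish
```

The product reduces to a multiple of sin(2(ω0 t + φ)), whose average over a full period of φ is exactly zero.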

To determine ρ, we argue as follows. In an interval (t, t + Δt), the realization moves from X(t) = X1 to

  X(t + Δt) ≈ X(t) + X'(t) Δt = X1 + X2 Δt,  (14-52)

and hence the realization intersects the zero level with positive slope somewhere in that interval if

  X1 < 0,  X2 > 0,  and  X(t + Δt) = X1 + X2 Δt > 0,

i.e., -X2 Δt < X1 < 0 (Fig. 14.11). Hence the probability of an upcrossing in (t, t + Δt) is given by

  ρ Δt = ∫_{x2=0}^{∞} ∫_{x1=-x2 Δt}^{0} f_{X1 X2}(x1, x2) dx1 dx2 = ∫_{0}^{∞} f_{X2}(x2) ∫_{-x2 Δt}^{0} f_{X1}(x1) dx1 dx2.  (14-53)

[Fig. 14.11: a realization rising from X(t) < 0 to X(t + Δt) > 0 across the zero level inside (t, t + Δt).]

Differentiating both sides of (14-53) with respect to Δt, we get

  ρ = ∫_{0}^{∞} f_{X2}(x2) x2 f_{X1}(-x2 Δt) dx2,  (14-54)

and letting Δt → 0, Eq. (14-54) reduces to

  ρ = ∫_{0}^{∞} x2 f_{X2}(x2) f_{X1}(0) dx2 = f_{X1}(0) ∫_{0}^{∞} x2 f_{X2}(x2) dx2
    = (1 / √(2π R_XX(0))) (σ2 / √(2π)) = (1 / 2π) √(-R''_XX(0) / R_XX(0)),  (14-55)

where we have made use of (5-78), Text. There is an equal probability for downcrossings, and hence the total probability for crossing the zero line in an interval (t, t + Δt) equals ρ0 Δt, where

  ρ0 = (1/π) √(-R''_XX(0) / R_XX(0)) >= 0.  (14-56)

It follows that in a long interval T there will be approximately ρ0 T crossings of the mean value. If -R''_XX(0) is large, then the autocorrelation function R_XX(τ) decays more rapidly as τ moves away from zero, implying large random variation around the origin (the mean value) for X(t); the likelihood of zero crossings should therefore increase with an increase in -R''_XX(0), agreeing with (14-56).
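The rate formula (14-56) can be checked deterministically on Example 14.2: there R_XX(τ) = (a²/2) cos ω0 τ, so -R''_XX(0)/R_XX(0) = ω0² and ρ0 = ω0/π, i.e. two crossings per period for every realization (assuming Python with numpy; ω0 = 3 and the fixed phase φ = 0.4 are arbitrary):

```python
import numpy as np

# Zero-crossing count for one realization of X(t) = a cos(omega0 t + phi);
# the amplitude a drops out, so it is set to 1. Expected rate: omega0/pi.
omega0, phi = 3.0, 0.4
T = 200.0
t = np.linspace(0.0, T, 400_000)
x = np.cos(omega0 * t + phi)                        # one realization
crossings = np.count_nonzero(np.diff(np.sign(x)))   # sign changes = zero crossings
rate_est = crossings / T
rate_theory = omega0 / np.pi                        # rho0 from (14-56)
```

Over T = 200 the count comes out within one crossing of ω0 T/π, as (14-56) predicts.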

Discrete-Time Stochastic Processes

A discrete-time stochastic process X_n = X(nT) is a sequence of random variables. The mean, autocorrelation and autocovariance functions of a discrete-time process are given by

  μ_n = E{X(nT)},  (14-57)
  R(n1, n2) = E{X(n1 T) X*(n2 T)},  (14-58)

and

  C(n1, n2) = R(n1, n2) - μ_{n1} μ*_{n2},  (14-59)

respectively. As before, the strict-sense and wide-sense stationarity definitions apply here also. For example, X(nT) is wide-sense stationary if

  E{X(nT)} = μ, a constant,  (14-60)

and

  E{X((k + n)T) X*(kT)} = R(n) = r_n = r*_{-n},  (14-61)

i.e., R(n1, n2) = R(n1 - n2) = R*(n2 - n1). The positive-definite property of the autocorrelation sequence in (14-8) can be expressed in terms of certain Hermitian-Toeplitz matrices as follows.

Theorem: A sequence {r_n} forms an autocorrelation sequence of a wide-sense stationary stochastic process if and only if every Hermitian-Toeplitz matrix T_n given by

  T_n = [ r_0      r_1      r_2     ...  r_n
          r_1*     r_0      r_1     ...  r_{n-1}
          ...      ...      ...     ...  ...
          r_n*     r_{n-1}* ...     r_1* r_0 ]  (14-62)

is non-negative (positive) definite for n = 0, 1, 2, ....

Proof: Let a = [a_0, a_1, ..., a_n]^T represent an arbitrary constant vector. Then from (14-62), since the Toeplitz character gives (T_n)_{i,k} = r_{k-i},

  a* T_n a = Σ_{i=0}^{n} Σ_{k=0}^{n} a_i* a_k r_{k-i}.  (14-63)

Using (14-61), r_{k-i} = E{X(kT) X*(iT)}, and Eq. (14-63) reduces to

  a* T_n a = Σ_i Σ_k a_i* a_k E{X(kT) X*(iT)} = E{ | Σ_{k=0}^{n} a_k X(kT) |² } >= 0,

which is the non-negative definiteness of T_n.
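The forward direction of the theorem is easy to exercise numerically (assuming Python with numpy; the sequence r_n = 0.9^|n|, the autocorrelation of a hypothetical AR(1)-type process, is an illustrative choice):

```python
import numpy as np

# Build the Hermitian-Toeplitz matrix (14-62) for r_n = 0.9**|n| and
# confirm it is non-negative definite via its smallest eigenvalue.
def toeplitz_from(r):
    n = len(r)
    idx = np.abs(np.arange(n)[:, None] - np.arange(n)[None, :])
    return np.asarray(r)[idx]       # real sequence, so r_{-k} = r_k* = r_k

r = 0.9 ** np.arange(8)             # r_0, r_1, ..., r_7
T8 = toeplitz_from(r)
min_eig = float(np.linalg.eigvalsh(T8).min())   # symmetric eigensolver
```

A sequence that is not a valid autocorrelation (e.g. one whose matrix acquires a negative eigenvalue) would fail this test, which is exactly what the theorem rules out.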
