Ergodic Stochastic Process
(Ergodicity)
Lecture 13
An ergodic process is a stationary random process whose statistical
characteristics, such as the mean and variance, can be estimated from the time
averages of its sample functions. Ergodicity simplifies the analysis of a
stochastic process: its statistical characteristics can be deduced from a
single realisation.
A weakly stationary stochastic process is said to be ergodic if it satisfies:
i. The time average equals the ensemble (mean) average.
Let X(t) be such a process with mean μ; ergodicity in the mean requires
<X(t)> = lim(T→∞) (1/2T) ∫ from -T to T of X(t) dt = E[X(t)] = μ.
Because the statistical properties of such a process do not change over time,
it is comparatively easy to analyse and forecast.
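The defining property, time average equals ensemble average, can be illustrated numerically. A minimal sketch in Python with NumPy (the i.i.d. standard normal sequence used here is one simple example of a stationary, ergodic process; the sample sizes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(42)
n_time, n_realisations = 100_000, 100_000

# Z(t): an i.i.d. standard normal sequence, a stationary, mean-ergodic process.

# Ensemble average: average over many realisations at one fixed time instant.
ensemble_mean = rng.standard_normal(n_realisations).mean()

# Time average: average over time of a single realisation (one sample function).
time_mean = rng.standard_normal(n_time).mean()

# For an ergodic process the two estimates agree; here both are near E[Z(t)] = 0.
print(round(ensemble_mean, 3), round(time_mean, 3))
```

Because the process is ergodic, a single long sample function is enough to estimate the ensemble mean, which is exactly the simplification described above.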
MCQs
1. Based on the state space S and the parameter (time) set T, stochastic processes are divided into how many types?
a) 2
b) 3
c) 4
d) 6
2. The blood pressure of a patient at time t is an example of a
a) Discrete Parameter, Discrete Process
b) Continuous Parameter, Continuous Process
c) Continuous Parameter, Discrete Process
d) Discrete Parameter, Continuous Process
3. If the joint distribution of X(t1), X(t2), ..., X(tn) is equal to the joint distribution of X(t1+τ), X(t2+τ), ..., X(tn+τ) for every τ and every n, the process is called
a) Stationary
b) Strong Stationary
c) Weak Stationary
d) Ergodic
4. A stochastic process that is stationary for first- and second-order distribution functions is called:
a) Stationary
b) Strong Stationary
c) Weak Stationary
d) Ergodic
5. [question text missing in source]
a) Time invariant
b) Constant
c) Variable
d) Both (a) & (b)
6. [question text missing in source]
a) Strong Stationary
b) Weak Stationary
c) Non-Stationary
d) Ergodic
7. [question text missing in source]
a) Non-Stationary
b) Weak Stationary
c) Ergodic
d) None of them
Let Z(t) be a sequence of independent normal variables with mean zero and unit
variance, and let a, b, and c be constants. Is the following process
stationary? If so, specify its mean and auto-covariance.
X(t) = a + bZ(t) + cZ(t-2)
Solution: Z(t) is a standard normal variable, so
E[Z(t)] = 0
Var[Z(t)] = 1
E[X(t)] = E[a + bZ(t) + cZ(t-2)] = a + bE[Z(t)] + cE[Z(t-2)] = a
The mean is time invariant.
For the auto-covariance, since a is a constant,
Cov[X(t), X(t+τ)] = b² Cov[Z(t), Z(t+τ)] + bc Cov[Z(t), Z(t+τ-2)] + bc Cov[Z(t-2), Z(t+τ)] + c² Cov[Z(t-2), Z(t+τ-2)]   ... (1)
Because the Z's are independent with unit variance, Cov[Z(s), Z(u)] = 1 when s = u and 0 otherwise.
At τ = 0:
Cov[Z(t), Z(t)] = E[{Z(t) - E(Z(t))} {Z(t) - E(Z(t))}] = E[Z(t) Z(t)] = Var[Z(t)] = 1
and likewise Cov[Z(t-2), Z(t-2)] = Var[Z(t-2)] = 1, so equation (1) gives b² + c².
At τ = -2:
Cov[Z(t-2), Z(t+τ)] = Cov[Z(t-2), Z(t-2)] = Var[Z(t-2)] = 1
so equation (1) gives bc.
At τ = 2:
Cov[Z(t), Z(t+τ-2)] = Cov[Z(t), Z(t)] = Var[Z(t)] = 1
so equation (1) again gives bc.
For every other value of τ, all four covariances in equation (1) vanish.
Hence
Cov[X(t), X(t+τ)] = b² + c²  for τ = 0
                  = bc       for τ = ±2
                  = 0        otherwise
The covariance depends only on the time lag τ, not on t, and the mean is constant, so
X(t) = a + bZ(t) + cZ(t-2) is weakly (wide-sense) stationary.
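The mean and auto-covariance derived above can be checked by simulation. A minimal sketch in Python with NumPy; the values a = 2, b = 3, c = 1.5 are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
a, b, c = 2.0, 3.0, 1.5    # arbitrary illustrative constants
n = 200_000

z = rng.standard_normal(n)        # Z(t): i.i.d. N(0, 1)
x = a + b * z[2:] + c * z[:-2]    # X(t) = a + b*Z(t) + c*Z(t-2)

def sample_autocov(x, lag):
    """Sample auto-covariance of x at the given lag."""
    xc = x - x.mean()
    return np.mean(xc[lag:] * xc[:len(xc) - lag])

print(x.mean())                 # close to a = 2
print(sample_autocov(x, 0))     # close to b^2 + c^2 = 11.25
print(sample_autocov(x, 2))     # close to b*c = 4.5
print(sample_autocov(x, 1))     # close to 0
```

The estimates depend only on the lag, not on where in the series they are computed, consistent with weak stationarity.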