ERRORS IN VARIABLES

The basic assumption of most statistical studies is that the data on the variables are measured accurately. In regression and econometric models, the assumption is that both the independent and dependent variables are measured without error, so that the entire deviation is attributed to the error term. If this fundamental assumption is violated and the dependent and independent variables each consist of a true part and an error part, the problem is called errors in variables.

Consider the simple linear regression model:

Y = α + βX + ϵ

Where:

E(ϵ) = 0 and Var(ϵ) = σϵ²

Now suppose the dependent and independent variables are not free from measurement error, so that each variable has a true part and an error part. Let the observed variables be X = x + u and Y = y + v, where u and v are the error portions in the independent and dependent variables, respectively. The discrepancy between the observed and true values of the variables is referred to as errors in variables. Under the classical errors-in-variables assumption, the true part is independent of the error part.

That is,

E(xu) = 0 and E(yv) = 0

Moreover, the covariance between an observed variable and its own error part equals the variance of that error term; for example, treating x as fixed, Cov(X, u) = E[(x + u − x)u] = E(u²) = σu².

The model involving errors in variables is

y = α + βx + ϵ

Where:

Y = y + v and X = x + u

Y − v = α + β(X − u) + ϵ

Y = α + βX + (ϵ + v − βu)

Y = α + βX + w

where:

w = (ϵ + v − βu)

The newly formed error term w depends on X, because X contains u and w contains −βu, so the OLS assumption that the regressor is uncorrelated with the disturbance is violated. The OLS method therefore cannot be applied to estimate the parameters of the model. Thus, it is established that if the variables are measured with error, the OLS method cannot be employed directly.
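This dependence can be verified numerically. The sketch below is a minimal illustration with assumed parameter values (β = 2, σu = 0.8, σv = 0.5); none of these numbers come from the text:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
beta = 2.0                      # illustrative true slope (assumed)
x = rng.normal(0, 1, n)         # true regressor
u = rng.normal(0, 0.8, n)       # measurement error in X
v = rng.normal(0, 0.5, n)       # measurement error in Y
eps = rng.normal(0, 1, n)       # model disturbance

X = x + u                       # observed regressor
w = eps + v - beta * u          # composite error w = eps + v - beta*u

# The sample covariance is clearly nonzero: it is close to
# -beta * sigma_u^2 = -2 * 0.64 = -1.28.
print(np.cov(X, w)[0, 1])
```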

If the variables are measured with error, let us make the following assumptions.

 1. E(u) = 0 and Var(u) = σu²

 2. E(v) = 0 and Var(v) = σv²

 3. plim (1/n) Σxv = 0 (the true x is uncorrelated with v)

 4. plim (1/n) Σyu = 0 (the true y is uncorrelated with u)

 5. plim (1/n) Σ(X − X̄)² = σX² (the variance of the observed X)

 6. plim (1/n) Σ(Y − Ȳ)² = σY² (the variance of the observed Y)

Under these assumptions, the composite error term has zero mean:

E(w) = E(ϵ + v − βu)

E(w) = E(ϵ) + E(v) − βE(u)

E(w) = 0


Consequences when the Dependent Variable is Measured with Error

Case – 1: Error is measured in the dependent variable, that is, Y = y + v and X = x (no measurement error in X)

Consider the SLRM ignoring the intercept:

y = βx + ϵ

Y − v = βX + ϵ

Y = βX + ϵ + v

Y = βX + w

where:

w = ϵ + v

E(w) = E(ϵ) + E(v)

E(w) = 0

The covariance of X and w:

Cov(X, w) = E(Xw) − E(X)E(w)

Cov(X, w) = E(Xw)

Cov(X, w) = E(X(ϵ + v))

Cov(X, w) = E(Xϵ) + E(Xv)

Cov(X, w) = 0

In this model X is free of measurement error and is independent of both ϵ and v, so Cov(X, w) = 0. The OLS method can be employed to estimate the parameters of the model.

The OLS estimate is given by:

β̂OLS = ΣXY / ΣX² = β + ΣXw / ΣX²

i. The OLS estimate is unbiased if the dependent variable is measured with error. Since E(Xw) = 0,

E(β̂OLS) = β

ii. The variance of the OLS estimate is inflated if the dependent variable is measured with error.

The variance of the OLS estimate is given by:

Var(β̂OLS) = σw² / ΣX² = (σϵ² + σv²) / ΣX²

Since σϵ² + σv² > σϵ², the variance of the OLS estimate is inflated relative to the error-free case, where it would be σϵ² / ΣX².

iii. The OLS estimate is consistent if the dependent variable is measured with error.

An estimate is said to be consistent if plim β̂ = β. Here,

plim β̂OLS = β + plim((1/n)ΣXw) / plim((1/n)ΣX²) = β + 0 / σX² = β
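A short Monte Carlo sketch can illustrate both Case 1 conclusions. All parameter values below (β = 2, σϵ = σv = 1, n = 100) are assumed for illustration only: the mean of β̂OLS stays at β whether or not Y carries measurement error, while its sampling variance roughly doubles when it does.

```python
import numpy as np

rng = np.random.default_rng(1)
beta, n, reps = 2.0, 100, 5_000   # illustrative values (assumed)
x = rng.normal(0, 1, n)           # true regressor, held fixed across replications

def ols_slope(X, Y):
    # No-intercept OLS slope: sum(X*Y) / sum(X^2)
    return (X @ Y) / (X @ X)

b_clean, b_noisy = [], []
for _ in range(reps):
    eps = rng.normal(0, 1, n)              # model disturbance
    v = rng.normal(0, 1, n)                # measurement error in Y
    y = beta * x + eps                     # true dependent variable
    b_clean.append(ols_slope(x, y))        # Y observed without error
    b_noisy.append(ols_slope(x, y + v))    # Y = y + v observed with error

# Both means are close to beta = 2 (unbiased), but the sampling variance
# roughly doubles, since Var(w) = sigma_eps^2 + sigma_v^2 = 2 sigma_eps^2.
print(np.mean(b_clean), np.var(b_clean))
print(np.mean(b_noisy), np.var(b_noisy))
```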




Case – 2: Error is measured in the independent variable, that is, X = x + u and Y = y (no measurement error in Y)

Consider the model

y = βx + ϵ

Y = β(X − u) + ϵ

Y = βX + ϵ − βu

Y = βX + w

where: w = ϵ − βu

E(w) = E(ϵ − βu)

E(w) = E(ϵ) − βE(u)

E(w) = 0

Here we check whether X and w are independent by determining their covariance:

Cov(X, w) = E{(X − E(X))(w − E(w))}

Cov(X, w) = E{(x + u − E(x + u)) w}

Cov(X, w) = E{(x + u − E(x) − E(u)) w}

Cov(X, w) = E{(x + u − x) w}     [x fixed, so E(x) = x; E(u) = 0]

Cov(X, w) = E{uw}

Cov(X, w) = E{u(ϵ − βu)}

Cov(X, w) = E{uϵ − βu²}

Cov(X, w) = E(uϵ) − βE(u²)

Cov(X, w) = −βσu²

In this model w = ϵ − βu is not independent of X; thus Cov(X, w) = −βσu² ≠ 0, which violates the classical assumption.

Consequences when independent variable is measured with error

i. The OLS estimate is biased when the independent variable is measured with error.

ii. The variance of the OLS estimate is inflated.

iii. The OLS estimate is inconsistent if the independent variable is measured with error, that is, Y = y and X = x + u.

The OLS estimate of β is

β̂OLS = ΣXY / ΣX² = β + ΣXw / ΣX²

Taking probability limits,

plim β̂OLS = β + Cov(X, w) / Var(X) = β + (−βσu²) / (σx² + σu²) = β σx² / (σx² + σu²) ≠ β

so β̂OLS is inconsistent and attenuated toward zero.
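This attenuation can be checked with a minimal numerical sketch (assumed values: β = 2, σx² = σu² = 1, so the attenuation factor is 0.5):

```python
import numpy as np

rng = np.random.default_rng(2)
beta, n = 2.0, 500_000        # illustrative values (assumed)
x = rng.normal(0, 1, n)       # true regressor, sigma_x^2 = 1
u = rng.normal(0, 1, n)       # measurement error in X, sigma_u^2 = 1
eps = rng.normal(0, 1, n)     # model disturbance

Y = beta * x + eps            # Y measured without error
X = x + u                     # X measured with error

b_hat = (X @ Y) / (X @ X)     # no-intercept OLS slope
# plim is beta * sigma_x^2 / (sigma_x^2 + sigma_u^2) = 2 * 0.5 = 1.0,
# i.e. the slope is attenuated toward zero, not equal to beta = 2.
print(b_hat)
```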


Case – 3: Error is measured in both the independent and dependent variables, that is, X = x + u and Y = y + v

Consider the model

y = βx + ϵ

Y − v = β(X − u) + ϵ

Y = βX + ϵ + v − βu

Y = βX + w

where: w = ϵ + v − βu

E(w) = E(ϵ + v − βu)

E(w) = E(ϵ) + E(v) − βE(u)

E(w) = 0

The classical assumptions require that the predictor be independent of the model disturbance term. Here we check whether this assumption holds.

Cov(X, w) = E[{X − E(X)}{w − E(w)}]

Cov(X, w) = E[{x + u − E(x + u)}{w}]

Cov(X, w) = E[{x + u − E(x) − E(u)}{w}]

Cov(X, w) = E[{x + u − x}{w}]     [x fixed, so E(x) = x; E(u) = 0]

Cov(X, w) = E[{u}{w}]

Cov(X, w) = E[u{ϵ + v − βu}]

Cov(X, w) = E[uϵ + uv − βu²]

Cov(X, w) = −βE[u²]

Cov(X, w) = −βσu²

The classical assumption that the predictor is independent of the model disturbance term does not hold.

Consequences when independent & dependent variables are measured with error

i. The variance of the OLS estimator increases when the independent and dependent variables are measured with error, since Var(w) = σϵ² + σv² + β²σu².

ii. The OLS estimator is inconsistent when X and Y are measured with error:

plim β̂OLS = β + Cov(X, w) / Var(X) = β σx² / (σx² + σu²) ≠ β
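Finally, a sketch for Case 3 (same assumed values as above, plus σv² = 1): the error in Y leaves the probability limit at the same attenuated value, βσx²/(σx² + σu²), while further inflating the sampling variability.

```python
import numpy as np

rng = np.random.default_rng(3)
beta, n = 2.0, 500_000         # illustrative values (assumed)
x = rng.normal(0, 1, n)        # true regressor, sigma_x^2 = 1
u = rng.normal(0, 1, n)        # error in X, sigma_u^2 = 1
v = rng.normal(0, 1, n)        # error in Y, sigma_v^2 = 1
eps = rng.normal(0, 1, n)      # model disturbance

X = x + u                      # observed regressor
Y = beta * x + eps + v         # observed dependent variable

b_hat = (X @ Y) / (X @ X)      # no-intercept OLS slope
# The probability limit is still beta * sigma_x^2 / (sigma_x^2 + sigma_u^2)
# = 1.0, exactly as in Case 2: the error in Y adds noise, not extra bias.
print(b_hat)
```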





