ERRORS IN VARIABLES
The basic assumption of all statistical studies is that the data on the variables are accurately measured. In regression and econometric models, the assumption is that both the independent and dependent variables are measured without error and that the whole deviation is captured by the disturbance term. If this fundamental assumption is violated, so that the dependent and independent variables each have a real part and an error part, the problem is called errors in variables.
Consider the simple linear regression model without intercept:
y = β x + ε
Now, if the dependent and independent variables are not free from measurement error, both will have a real part and an error part. Let the observed independent and dependent variables be X = x + u and Y = y + v, where u and v are the measurement errors. It is assumed that the errors are uncorrelated with the true values, that is,
E ( x u ) = 0 and E ( y v ) = 0
The covariance between an observed variable and its error part equals the variance of that error term: Cov ( X, u ) = E ( ( x + u ) u ) = Var ( u ), and similarly Cov ( Y, v ) = Var ( v ). Substituting the observed variables into the model gives the errors-in-variables model:
Y - v = β ( X - u ) + ε
Y = β X + ( ε + v - β u ) = β X + w
where Y = y + v, X = x + u, and the new disturbance is w = ε + v - β u.
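The claim that the covariance between an observed variable and its error part equals the error variance can be checked numerically. This is a minimal sketch with assumed error variances (Var ( u ) = 0.49 here is an illustrative choice, not from the text):

```python
# Numerical check (assumed values): Cov(X, u) = Var(u), because the
# true part x is uncorrelated with the measurement error u.
import numpy as np

rng = np.random.default_rng(42)
n = 500_000
x = rng.normal(0, 1, n)     # true values
u = rng.normal(0, 0.7, n)   # measurement error, Var(u) = 0.49
X = x + u                   # observed values

print(np.cov(X, u)[0, 1])   # approx Var(u) = 0.49
```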
The newly formed error term w = ε + v - β u depends on X, so the OLS assumption of independence between the regressor and the disturbance is violated. The OLS method therefore cannot be applied to estimate the parameters of the model. Thus it is established that if the variables are measured with error, the OLS method cannot be employed directly.
If the variables are measured with error, consider the following cases.
Consequences of Dependent Variable Measured with Error
Case – 1: Error is in the dependent variable, that is, Y = y + v
Consider the SLRM without intercept, y = β x + ε. Since Y = y + v,
Y = β x + ( ε + v ) = β x + w, where w = ε + v and E ( w ) = 0
Here the regressor X = x carries no error. The covariance of X and w:
Cov ( X, w ) = E ( X w ) - E ( X ) E ( w )
Cov ( X, w ) = E ( X w )
Cov ( X, w ) = E ( X ( ε + v ) )
In this model w and X are independent, so Cov ( X, w ) = 0.
The OLS estimate is given by:
β̂ = Σ X Y / Σ X² = Σ X ( β X + w ) / Σ X² = β + Σ X w / Σ X²
i. The OLS estimate is unbiased if the dependent variable is measured with error. Taking expectations,
E ( β̂ ) = β + E ( Σ X w ) / Σ X² = β, since E ( w ) = 0.
ii. The variance of the OLS estimate is inflated if the dependent variable is measured with error. The variance of the OLS estimate is given by:
Var ( β̂ ) = Var ( w ) / Σ X² = ( Var ( ε ) + Var ( v ) ) / Σ X²
which exceeds Var ( ε ) / Σ X², the variance when Y is error-free. Hence the variance of the OLS estimate is inflated.
iii. The OLS estimate is consistent if the dependent variable is measured with error. An estimate is said to be consistent if plim β̂ = β as n → ∞. Since w is independent of X and E ( w ) = 0, plim ( Σ X w / Σ X² ) = 0, so plim β̂ = β.
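The three claims of Case 1 can be illustrated by simulation. This is a sketch under assumed values (β = 2, unit error variances, a fixed regressor): the estimate centres on β even when Y is noisy, but its variance is larger than in the error-free case.

```python
# Monte Carlo sketch for Case 1 (assumed setup): only the dependent
# variable carries measurement error, Y = y + v.  The no-intercept OLS
# slope stays unbiased, but its sampling variance is inflated.
import numpy as np

rng = np.random.default_rng(0)
beta, n, reps = 2.0, 200, 5000
x = rng.normal(0, 1, n)          # regressor, error-free in Case 1

est_clean, est_noisy = [], []
for _ in range(reps):
    eps = rng.normal(0, 1, n)    # model disturbance
    v = rng.normal(0, 1, n)      # measurement error in Y
    y = beta * x + eps           # true dependent variable
    Y = y + v                    # observed dependent variable
    est_clean.append(np.sum(x * y) / np.sum(x * x))  # OLS without intercept
    est_noisy.append(np.sum(x * Y) / np.sum(x * x))

print(np.mean(est_noisy))                     # approx beta = 2: unbiased
print(np.var(est_noisy) > np.var(est_clean))  # variance inflated
```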
Case – 2: Error is in the independent variable, that is, X = x + u
Consider the model y = β x + ε. Since x = X - u,
y = β ( X - u ) + ε = β X + ( ε - β u ) = β X + w, where w = ε - β u
Now determine the covariance between X and w:
Cov ( X, w ) = E { ( X - E ( X ) ) ( w - E ( w ) ) }
Cov ( X, w ) = E { ( x + u - E ( x + u ) ) w }
Cov ( X, w ) = E { ( x + u - E ( x ) - E ( u ) ) w }
Cov ( X, w ) = E { ( x + u - x ) w } = E ( u w )
Cov ( X, w ) = E ( u ( ε - β u ) ) = - β Var ( u ) ≠ 0
In this model w = ε - β u is correlated with X, which violates the classical assumption.
Consequences when Independent Variable is Measured with Error
i. The OLS estimate is biased when the independent variable is measured with error.
ii. The variance of the OLS estimate is inflated.
iii. The OLS estimate is inconsistent if the independent variable is measured with error; that is, plim β̂ ≠ β. The probability limit of the OLS estimate of β is
plim β̂ = β Var ( x ) / ( Var ( x ) + Var ( u ) )
so β̂ is attenuated toward zero.
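The attenuation in Case 2 can be seen numerically. This is a sketch with assumed values (β = 2 and Var ( x ) = Var ( u ) = 1, so the attenuation factor is 1/2); a large sample approximates the probability limit.

```python
# Monte Carlo sketch for Case 2 (assumed setup): only the regressor is
# measured with error, X = x + u.  No-intercept OLS is attenuated by
# Var(x) / (Var(x) + Var(u)) = 1/2, so the estimate sits near beta/2.
import numpy as np

rng = np.random.default_rng(1)
beta, n = 2.0, 200_000           # large n to approximate the plim
x = rng.normal(0, 1, n)          # true regressor, Var(x) = 1
u = rng.normal(0, 1, n)          # measurement error, Var(u) = 1
eps = rng.normal(0, 1, n)        # model disturbance

y = beta * x + eps               # dependent variable, error-free here
X = x + u                        # observed regressor

beta_hat = np.sum(X * y) / np.sum(X * X)  # OLS without intercept
print(beta_hat)                  # approx beta * 1/2 = 1, not 2
```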
Case – 3: Error is in both the independent and dependent variables, that is, X = x + u and Y = y + v
Consider the model Y = β X + w, where w = ε + v - β u.
A classical assumption is that the predictor is independent of the model's disturbance term; here we check whether this assumption holds.
Cov ( X, w ) = E [ { X - E ( X ) } { w - E ( w ) } ]
Cov ( X, w ) = E [ { x + u - E ( x + u ) } w ]
Cov ( X, w ) = E [ { x + u - E ( x ) - E ( u ) } w ]
Cov ( X, w ) = E ( u w ) = E ( u ( ε + v - β u ) ) = - β Var ( u ) ≠ 0
Consequences when Independent and Dependent Variables are Measured with Error
i. The variance of the OLS estimator increases when both the independent and dependent variables are measured with error.
ii. The OLS estimator is inconsistent when X and Y are measured with error.
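Case 3 can also be checked by simulation. This is a sketch under assumed values (β = 2, Var ( u ) = 0.25): the composite disturbance w = ε + v - β u is correlated with X, so Cov ( X, w ) sits near - β Var ( u ) and the OLS slope does not converge to β.

```python
# Monte Carlo sketch for Case 3 (assumed setup): both variables carry
# measurement error.  Cov(X, w) is near -beta * Var(u), and the OLS
# slope converges to beta * Var(x) / (Var(x) + Var(u)), not to beta.
import numpy as np

rng = np.random.default_rng(2)
beta, n = 2.0, 200_000
x = rng.normal(0, 1, n)           # true regressor, Var(x) = 1
u = rng.normal(0, 0.5, n)         # error in X, Var(u) = 0.25
v = rng.normal(0, 1, n)           # error in Y
eps = rng.normal(0, 1, n)         # model disturbance

X = x + u                         # observed regressor
Y = beta * x + eps + v            # observed dependent variable
w = eps + v - beta * u            # composite disturbance in Y = beta*X + w

print(np.cov(X, w)[0, 1])         # approx -beta * Var(u) = -0.5
beta_hat = np.sum(X * Y) / np.sum(X * X)
print(beta_hat)                   # approx beta * 1/(1 + 0.25) = 1.6
```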
- Read More: Estimation of Parameters in Errors in Variables
- Read More: Dummy Variables
- Read More: Logistic Regression