Effects & Consequences of Multicollinearity

Effects of Multicollinearity on OLS Estimate

The occurrence of multicollinearity has a variety of negative consequences for the OLS estimates of the regression coefficients. The main points are the following:

Consider the following GLRM:

Yi = β1 + β2X2i + β3X3i + ui,    i = 1, 2, …, n

1. In case of perfect multicollinearity, the OLS estimates are indeterminate.

If, say, X3i = λX2i exactly, the OLS formulas for β̂2 and β̂3 reduce to the indeterminate form 0/0: the individual effects of X2 and X3 on Y cannot be separated, which is not a valid situation.
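This breakdown can be checked numerically. The sketch below (an illustrative setup in Python with NumPy, not from the lecture) builds a design matrix whose third column is exactly twice the second; X′X is then singular, so the normal equations have no unique solution:

```python
import numpy as np

# Two regressors that are perfectly collinear: x3 is exactly twice x2.
x2 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x3 = 2.0 * x2
X = np.column_stack([np.ones_like(x2), x2, x3])

XtX = X.T @ X
print(np.linalg.det(XtX))          # effectively 0: X'X is singular
print(np.linalg.matrix_rank(XtX))  # 2 < 3, so the normal equations
                                   # have no unique solution
```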

2. In case of imperfect multicollinearity, the OLS estimates are sensitive.

This correlation is a problem because the explanatory variables are supposed to be independent of one another. If the degree of correlation between them is strong enough, it causes trouble both when fitting the model and when interpreting the results.
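This sensitivity is easy to demonstrate. In the sketch below (an assumed setup, not from the lecture), two nearly identical regressors are used and y is then perturbed by a tiny amount in the direction the design can barely distinguish; the individual slope estimates swing dramatically even though the data changed by almost nothing:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 30
x2 = rng.normal(size=n)
x3 = x2 + 0.001 * rng.normal(size=n)      # nearly perfect collinearity
y = 1.0 + 2.0 * x2 + 3.0 * x3 + rng.normal(size=n)
X = np.column_stack([np.ones(n), x2, x3])

b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Perturb y by a vector of length 0.01 aligned with x3 - x2, the
# direction the near-collinear design can barely identify.
d = x3 - x2
y_pert = y + 0.01 * d / np.linalg.norm(d)
b_pert, *_ = np.linalg.lstsq(X, y_pert, rcond=None)

# The two slope estimates shift by a large amount, in opposite
# directions, while their sum stays almost unchanged.
print(b[1:], b_pert[1:])
```

Note that the sum β̂2 + β̂3 barely moves: it is the individual coefficients, not their combined effect, that the collinear data fail to pin down.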

3. In case of perfect multicollinearity, the variance of the OLS estimates is undefined.

The variance formula divides by (1 − r23²), which is zero when r23 = ±1, so the variance blows up to infinity. This infinite variance is a serious issue: the margin of error becomes substantially larger, resulting in an extremely wide confidence interval.

4. In case of imperfect multicollinearity, the variance of the OLS estimates is sensitive.

The variance depends on the correlation coefficient between the regressors. For the model above,

Var(β̂2) = σ² / (Σx2i² (1 − r23²))

where x2i denotes the deviation of X2i from its mean and r23 is the correlation coefficient between X2 and X3. As |r23| grows, the factor 1/(1 − r23²) inflates the variance, making it sensitive.



Note that if r is close to zero, multicollinearity is not destructive and is referred to as non-harmful multicollinearity. When r is close to +1 or -1, multicollinearity inflates the variance and causes it to skyrocket. This is known as detrimental multicollinearity.
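The inflation factor 1/(1 − r²) can be tabulated directly; the short sketch below shows how quickly the variance blows up as r approaches 1:

```python
# Variance inflation factor 1 / (1 - r^2) for increasing |r|.
for r in [0.0, 0.5, 0.9, 0.99, 0.999]:
    vif = 1.0 / (1.0 - r ** 2)
    print(f"r = {r:5}: variance multiplied by {vif:10.1f}")
```

At r = 0.9 the variance is already about 5 times larger than with uncorrelated regressors; at r = 0.999 it is roughly 500 times larger.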

Consequences of Multicollinearity

In case of perfect multicollinearity:

1. The OLS estimates are indeterminate, and so is their standard error.



In case of imperfect multicollinearity:

2. The OLS estimates are sensitive, and the standard errors of the OLS estimates are quite sensitive as well. As the correlation between the predictor variables increases, so does the standard error, which has a significant impact on inferences based on such estimates.

Due to the large standard errors, the regression coefficients may not appear significant. Consequently, important variables may be dropped from the model.

For example, to test H0: β2 = 0, we use the t-statistic

t = β̂2 / SE(β̂2)

With an inflated SE(β̂2), the computed t is pushed toward zero, so we may fail to reject H0 even when X2 genuinely belongs in the model.
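A rough numerical illustration (assumed data, not from the lecture): the same true coefficients are estimated under a collinear design and under a well-conditioned one. The collinear fit produces far larger standard errors and correspondingly unreliable t-statistics:

```python
import numpy as np

def ols_fit(X, y):
    """OLS via the normal equations: coefficients, standard errors, t-stats."""
    n, k = X.shape
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ b
    s2 = resid @ resid / (n - k)                       # estimate of sigma^2
    se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
    return b, se, b / se

rng = np.random.default_rng(1)
n = 50
x2 = rng.normal(size=n)
e = rng.normal(size=n)

x3_coll = x2 + 0.01 * rng.normal(size=n)   # nearly duplicates x2
x3_ind = rng.normal(size=n)                # unrelated regressor

for label, x3 in [("collinear", x3_coll), ("independent", x3_ind)]:
    X = np.column_stack([np.ones(n), x2, x3])
    y = 1.0 + 2.0 * x2 + 3.0 * x3 + e      # same true coefficients each time
    b, se, t = ols_fit(X, y)
    # Collinear design: huge SEs on the slopes, so t is unreliable;
    # independent design: sharp estimates with large t-statistics.
    print(label, "SE:", se[1:], "t:", t[1:])
```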
3. The confidence intervals tend to be broader due to the large standard errors. The precision of the interval estimates is diminished, and the likelihood of a type II error rises.


4. In cases of severe multicollinearity, the OLS estimates may be sensitive to tiny changes in the values of the explanatory variables. If certain observations are added or removed, the magnitude and even the sign of the OLS estimates may change substantially. Ideally, the OLS estimates should remain stable when a few observations are added or removed.
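This instability can be demonstrated by refitting after deleting a handful of observations (again an assumed setup, not from the lecture). Under severe multicollinearity the coefficient on x2 typically moves by a large amount, far more than the deleted data points would justify in a well-conditioned model:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50
x2 = rng.normal(size=n)
x3 = x2 + 0.001 * rng.normal(size=n)      # severe multicollinearity
y = 1.0 + 2.0 * x2 + 3.0 * x3 + rng.normal(size=n)
X = np.column_stack([np.ones(n), x2, x3])

def x2_slope(X, y):
    """Coefficient on x2 from an OLS fit."""
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

b_full = x2_slope(X, y)
b_drop = x2_slope(X[5:], y[5:])   # refit without the first five rows
print(b_full, b_drop)             # the estimate moves substantially
```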

