Estimation of Regression Parameters by the OLS Method
In previous lectures, we discussed regression, estimating parameters from sample data, and goodness of fit. In this lecture, we will look at how these estimation formulas are derived by the OLS method, as well as their properties.
Ordinary Least Squares Approach
The ordinary least squares (OLS) approach is a method for estimating the unknown parameters of a linear regression model. The method minimizes the sum of squared residuals between the observed values of the response variable and the values predicted by the model.
Consider the simple linear regression model:

$$y_i = \beta_0 + \beta_1 x_i + \varepsilon_i, \qquad i = 1, 2, \ldots, n$$

If the regression model for an activity or phenomenon is correctly specified, the error component has mean zero. Using this assumption, the predicted model will be:

$$\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i$$
The residual sum of squares is defined as:

$$S(\hat{\beta}_0, \hat{\beta}_1) = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n}\left(y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i\right)^2$$
According to the principle of least squares, we determine the values of the intercept $\hat{\beta}_0$ and slope $\hat{\beta}_1$ that minimize this sum, by setting the partial derivative of $S$ with respect to each parameter equal to zero.

To estimate the intercept:

$$\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$$

To estimate the slope:

$$\hat{\beta}_1 = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n}(x_i - \bar{x})^2}$$
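These two closed-form estimates can be computed by hand in a few lines. The sketch below uses a small made-up data set (the numbers and variable names are illustrative, not from the lecture):

```python
# Minimal OLS-by-hand sketch on made-up data (illustrative only).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n

# Slope: sum of cross-deviations over sum of squared x-deviations
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
     sum((x - x_bar) ** 2 for x in xs)
# Intercept: the fitted line passes through the point (x_bar, y_bar)
b0 = y_bar - b1 * x_bar

print(b1, b0)  # slope ~ 1.99, intercept ~ 0.05 for these numbers
```

Note that the intercept formula uses the slope estimate, so the slope must be computed first.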
Properties of the OLS Estimates

Consider the simple linear regression model:

$$y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$$

The estimated model is given by:

$$\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i$$

where $\hat{\beta}_0$ and $\hat{\beta}_1$ are the OLS estimates of the intercept and slope.

Expressed in deviation form, with $\tilde{x}_i = x_i - \bar{x}$ and $\tilde{y}_i = y_i - \bar{y}$, the slope estimate becomes:

$$\hat{\beta}_1 = \frac{\sum_{i=1}^{n} \tilde{x}_i \tilde{y}_i}{\sum_{i=1}^{n} \tilde{x}_i^2}$$
i. The OLS estimates are linear functions of the response variable.

The slope can be written as a weighted sum of the observations:

$$\hat{\beta}_1 = \sum_{i=1}^{n} k_i y_i, \qquad \text{where } k_i = \frac{x_i - \bar{x}}{\sum_{j=1}^{n}(x_j - \bar{x})^2}$$

Hence, the slope is a linear function of the response variable.

ii. Similarly, the intercept can be written as:

$$\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x} = \sum_{i=1}^{n} w_i y_i, \qquad \text{where } w_i = \frac{1}{n} - \bar{x}\, k_i$$

Hence, the intercept is also a linear function of the response variable.
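The linearity property is easy to verify numerically: the weighted sums over the $y$-values reproduce the direct closed-form estimates. A sketch with made-up numbers (all names mine):

```python
# Verify that slope and intercept are weighted sums of the y-values,
# with weights that depend only on the x-values (made-up data).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n
s_xx = sum((x - x_bar) ** 2 for x in xs)

# k_i = (x_i - x_bar) / S_xx  -> slope weights
k = [(x - x_bar) / s_xx for x in xs]
# w_i = 1/n - x_bar * k_i     -> intercept weights
w = [1 / n - x_bar * ki for ki in k]

b1_weighted = sum(ki * y for ki, y in zip(k, ys))
b0_weighted = sum(wi * y for wi, y in zip(w, ys))

# Direct closed-form estimates for comparison
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / s_xx
b0 = y_bar - b1 * x_bar

print(abs(b1_weighted - b1), abs(b0_weighted - b0))  # both near zero
```

The weights $k_i$ and $w_i$ are fixed once the $x$-values are fixed, which is exactly what "linear in the response variable" means.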
iii. The variance of the slope is given by:

$$\operatorname{Var}(\hat{\beta}_1) = \frac{\sigma^2}{\sum_{i=1}^{n}(x_i - \bar{x})^2}$$

Now the variance of the OLS estimate of the intercept is defined as:

$$\operatorname{Var}(\hat{\beta}_0) = \sigma^2 \left(\frac{1}{n} + \frac{\bar{x}^2}{\sum_{i=1}^{n}(x_i - \bar{x})^2}\right)$$

iv. The covariance of the intercept and slope is defined as:

$$\operatorname{Cov}(\hat{\beta}_0, \hat{\beta}_1) = \frac{-\bar{x}\,\sigma^2}{\sum_{i=1}^{n}(x_i - \bar{x})^2} = -\bar{x}\operatorname{Var}(\hat{\beta}_1)$$
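These variance and covariance formulas can be checked by simulation: generate many response vectors from a known model, re-estimate the parameters each time, and compare the empirical moments to the formulas. A sketch, assuming made-up true parameters ($\beta_0 = 1$, $\beta_1 = 2$, $\sigma = 1$) and a small fixed design:

```python
import random

# Monte Carlo check of the variance/covariance formulas (illustrative:
# true parameters beta0 = 1, beta1 = 2, sigma = 1 are made up).
random.seed(0)
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
n = len(xs)
x_bar = sum(xs) / n
s_xx = sum((x - x_bar) ** 2 for x in xs)
sigma = 1.0

# Theoretical values from the formulas
var_b1 = sigma ** 2 / s_xx                           # Var(slope)
var_b0 = sigma ** 2 * (1 / n + x_bar ** 2 / s_xx)    # Var(intercept)
cov_b0_b1 = -x_bar * sigma ** 2 / s_xx               # Cov(intercept, slope)

b0s, b1s = [], []
for _ in range(20000):
    # Simulate a fresh response vector from the true model
    ys = [1.0 + 2.0 * x + random.gauss(0, sigma) for x in xs]
    y_bar = sum(ys) / n
    b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / s_xx
    b0 = y_bar - b1 * x_bar
    b1s.append(b1)
    b0s.append(b0)

m0 = sum(b0s) / len(b0s)
m1 = sum(b1s) / len(b1s)
emp_var_b1 = sum((b - m1) ** 2 for b in b1s) / len(b1s)
emp_var_b0 = sum((b - m0) ** 2 for b in b0s) / len(b0s)
emp_cov = sum((a - m0) * (b - m1) for a, b in zip(b0s, b1s)) / len(b0s)

print(var_b1, emp_var_b1)    # both close to 0.1
print(var_b0, emp_var_b0)    # both close to 1.1
print(cov_b0_b1, emp_cov)    # both close to -0.3
```

For this design $S_{xx} = 10$ and $\bar{x} = 3$, so the formulas give $\operatorname{Var}(\hat{\beta}_1) = 0.1$, $\operatorname{Var}(\hat{\beta}_0) = 1.1$, and $\operatorname{Cov}(\hat{\beta}_0, \hat{\beta}_1) = -0.3$; the simulated moments land close to these values. Note the negative covariance: since $\bar{x} > 0$, an overestimated slope tends to come with an underestimated intercept.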
- Read More: Features of OLS Estimates