Ordinary Least Squares (OLS):
- Non-iterative method that finds the best-fit line by minimizing the sum of squared differences between observed and predicted values.
Error = Σ (y_pred – y_act)^2
Line => y = b0 + b1*x
where y_act = actual value and y_pred = predicted value for each data point
- The above formula is for the univariate case (one variable).
- For the multivariate case, with many variables, the closed-form solution is the normal equation b = (X^T X)^-1 X^T y; inverting X^T X requires heavy computation when implemented in software, especially as the number of features grows.
- Fails for collinear predictors (correlated features), because X^T X becomes singular and cannot be inverted.
- Can be run in parallel, but it is still complicated and expensive (see the sketch below).
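A minimal NumPy sketch of both the univariate formula and the normal equation. This is my own illustration, assuming standard textbook OLS; the function names and synthetic data are not from the post.

import numpy as np

# Univariate OLS: b0, b1 that minimize sum((y_pred - y_act)^2).
def fit_ols_univariate(x, y):
    x_mean, y_mean = x.mean(), y.mean()
    b1 = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)
    b0 = y_mean - b1 * x_mean
    return b0, b1

# Multivariate OLS via the normal equation: b = inv(X^T X) X^T y.
# With perfectly collinear columns, X^T X is singular and np.linalg.inv
# raises LinAlgError -- the collinearity failure noted above.
def fit_ols_normal_equation(X, y):
    return np.linalg.inv(X.T @ X) @ X.T @ y

# Example on synthetic data (arbitrary values, for illustration only)
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 50)
y = 2.0 + 3.0 * x + rng.normal(0, 1, 50)
print(fit_ols_univariate(x, y))            # close to the true (2, 3)
X = np.column_stack([np.ones_like(x), x])  # add intercept column
print(fit_ols_normal_equation(X, y))       # same coefficients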
Gradient Descent:
- Applies to non-linear models as well.
- Works well even with collinear predictors.
- Saves a lot of computation time, since it can be run in parallel, distributing the load across multiple processors.
• Cost function: J(m, c) = Σ (y_pred – y_act)^2 / n, where n = number of data points
• Hypothesis: y_pred = c + m*x
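A minimal gradient-descent sketch for this hypothesis and cost. The learning rate, iteration count, and synthetic data are illustrative assumptions, not values from the post.

import numpy as np

# Gradient descent for y_pred = c + m*x with cost J(m, c) = sum((y_pred - y_act)^2) / n.
def gradient_descent(x, y, lr=0.01, n_iters=1000):
    m, c = 0.0, 0.0
    n = len(x)
    for _ in range(n_iters):
        y_pred = c + m * x
        # Partial derivatives of J with respect to m and c
        dm = (2.0 / n) * np.sum((y_pred - y) * x)
        dc = (2.0 / n) * np.sum(y_pred - y)
        m -= lr * dm
        c -= lr * dc
    return m, c

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 50)
y = 2.0 + 3.0 * x + rng.normal(0, 1, 50)
m, c = gradient_descent(x, y)
print(f"m={m:.2f}, c={c:.2f}")  # converges toward the OLS solution

Each update needs only the current gradient, which is why the per-step work is cheap and easy to distribute across processors, unlike the matrix inversion in the OLS case.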