Introduction to Linear Regression: Normal Equation

November 1, 2014

Normal Equation (Method II)

Using the Normal Equation to solve for the optimal \(\theta\) and obtain the hypothesis function is an alternative to the gradient descent method we discussed in the last post, and it is probably the simpler of the two, though each approach has its own benefits and drawbacks.
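For reference, the closed-form solution given by the normal equation (a standard result, stated here without derivation) is

\[ \theta = (X^T X)^{-1} X^T y \]

where \(X\) is the feature matrix and \(y\) is the vector of target values.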

Here we don't need to minimize the cost function iteratively, so there is no learning rate \(\alpha\) to select and no number of iterations to specify. As before, \(X\) is our feature matrix, and a column of ones should be added to \(X\) for the intercept term \(\theta_0\) before the calculation.
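As a minimal sketch of the calculation with NumPy (the toy data and variable names here are illustrative, not from the post), adding the column of ones for \(\theta_0\) and then solving the normal equation might look like this:

```python
import numpy as np

# Toy data: one feature, five training examples (illustrative values only)
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])   # feature matrix, m x n
y = np.array([2.1, 4.0, 6.2, 8.1, 9.9])             # target values, length m

# Add a column of ones to X for the intercept term theta_0
X_b = np.c_[np.ones((X.shape[0], 1)), X]

# Normal equation: theta = (X^T X)^{-1} X^T y
# np.linalg.solve is used instead of forming the inverse explicitly,
# which is the more numerically stable way to solve the linear system
theta = np.linalg.solve(X_b.T @ X_b, X_b.T @ y)

print(theta)   # [theta_0, theta_1]
```

The resulting vector holds the intercept \(\theta_0\) followed by the coefficients for each feature, which together define the hypothesis function.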