
Introduction to Linear Regression: Cost Function

October 29, 2014

Cost Function (Method I)

For calculating the cost in linear regression, we typically use the Sum of Squared Errors (SSE) method:

$$J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\left(h(x^{(i)}) - y^{(i)}\right)^2$$

The goal is to minimize J(θ) and find the θ values corresponding to the minimum cost. There are several optimization algorithms used to achieve this.
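As a minimal sketch, the SSE cost above can be computed directly in NumPy. The data, θ values, and the `cost` function name here are invented for illustration, assuming a hypothesis of the form h(x) = θᵀx with a bias column of ones in X:

```python
import numpy as np

def cost(theta, X, y):
    """SSE cost: J(theta) = 1/(2m) * sum((h(x^(i)) - y^(i))^2)."""
    m = len(y)
    residuals = X @ theta - y          # h(x^(i)) - y^(i) for each example
    return residuals @ residuals / (2 * m)

# Toy data: y = 2x, with a bias column of ones prepended to X
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])

print(cost(np.array([0.0, 2.0]), X, y))  # perfect fit -> 0.0
print(cost(np.zeros(2), X, y))           # nonzero cost for a bad fit
```

With θ = [0, 2] the hypothesis reproduces y exactly, so the cost is zero; any other θ gives a strictly positive cost, which is what the optimization algorithms mentioned above drive down.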

Introduction to Linear Regression: Polynomial Regression

October 26, 2014

Polynomial Regression

This form of regression is used to fit more complex functions. It is a general concept, not restricted to linear regression: it is also commonly used in classification algorithms such as Logistic Regression and Neural Networks. I hope to cover it in detail in a future post. For now, take a look at the following scatter plot.
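As a quick sketch of the idea, polynomial regression can be reduced to ordinary least squares by building a design matrix of powers of x. The quadratic data below is invented for illustration:

```python
import numpy as np

# Invented noise-free data generated from y = 1 + 2x + 3x^2
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 1 + 2 * x + 3 * x**2

# Build the polynomial design matrix [1, x, x^2] and solve by least squares
X = np.column_stack([x**0, x**1, x**2])
theta, *_ = np.linalg.lstsq(X, y, rcond=None)

print(theta)  # recovers approximately [1, 2, 3]
```

The model is still linear in the parameters θ, which is why the same linear-regression machinery (cost function and optimizer) applies unchanged; only the features are polynomial.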

Introduction to Linear Regression: A Machine Learning Approach

October 7, 2014

Supervised Learning is a form of learning in which we use known data with actual outputs from past experiences to model a relationship, and this model is then used to predict future outcomes. The known data used to build the model is called 'training data'.

To build a supervised learning model we need,
  1. Training Data
  2. Hypothesis
  3. Cost Function
  4. Optimization method for Minimization
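The four components above can be sketched together in a few lines of NumPy. This is a toy illustration, not a prescribed implementation; the data, learning rate, and iteration count are invented, and batch gradient descent stands in for the unspecified optimization method:

```python
import numpy as np

# 1. Training data: inputs (with a bias column of ones) and known outputs
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([3.0, 5.0, 7.0])  # generated from y = 1 + 2x

# 2. Hypothesis: h(x) = theta^T x
def h(theta, X):
    return X @ theta

# 3. Cost function: sum of squared errors
def J(theta):
    return np.sum((h(theta, X) - y) ** 2) / (2 * len(y))

# 4. Optimization: batch gradient descent on J
theta = np.zeros(2)
alpha = 0.1  # learning rate (an assumed value)
for _ in range(2000):
    theta -= alpha * X.T @ (h(theta, X) - y) / len(y)

print(theta)  # approaches [1, 2], the parameters that generated y
```

Each numbered component of the list maps to one labeled section of the code, which is the structure the following posts on hypotheses and cost functions build on.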

