All the code is shown here.
Linear regression with multiple variables
Ok, but what if you have multiple variables? How do we work with them using linear regression? That is where linear regression with multiple variables comes in. Let's see an example: predicting the price of a house from its size in square feet and its number of bedrooms. Because these two features differ by orders of magnitude, we first normalize them:
- Subtract the mean value of each feature from the dataset.
- After subtracting the mean, scale (divide) each feature value by its standard deviation (see the sketch after this list).
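Here is a minimal sketch of that normalization step in Python with NumPy; the function name `feature_normalize` and the sample rows are illustrative, not taken from the original code:

```python
import numpy as np

def feature_normalize(X):
    """Normalize each feature (column): subtract its mean,
    then divide by its standard deviation."""
    mu = X.mean(axis=0)       # per-feature mean
    sigma = X.std(axis=0)     # per-feature standard deviation
    return (X - mu) / sigma, mu, sigma

# Size in sq-ft and bedroom count live on very different scales:
X = np.array([[2104.0, 3.0],
              [1600.0, 3.0],
              [2400.0, 3.0],
              [1416.0, 2.0]])
X_norm, mu, sigma = feature_normalize(X)
print(X_norm)  # every column now has mean 0 and std 1
```

Keep `mu` and `sigma` around: any new input must be normalized with the same values before making a prediction.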
Running gradient descent on the normalized features yields the fitted parameters 215810.61679138, 61446.18781361, 20070.13313796, and with them the model predicts:
Predicted price of a 1650 sq-ft, 3 br house: 183865.197988
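The following is a minimal, self-contained sketch of how such a prediction could be produced with batch gradient descent. The tiny dataset, the learning rate `alpha`, and the iteration count are assumptions, so the printed numbers will not match the figures above exactly:

```python
import numpy as np

# Illustrative training data (size in sq-ft, bedrooms -> price);
# the real dataset is larger.
X = np.array([[2104.0, 3.0],
              [1600.0, 3.0],
              [2400.0, 3.0],
              [1416.0, 2.0]])
y = np.array([399900.0, 329900.0, 369000.0, 232000.0])

# Normalize features, keeping mu/sigma to transform new inputs later.
mu, sigma = X.mean(axis=0), X.std(axis=0)
X_norm = (X - mu) / sigma
Xb = np.c_[np.ones(len(X_norm)), X_norm]  # prepend intercept column

# Batch gradient descent; alpha and num_iters are assumed values.
alpha, num_iters = 0.1, 400
theta = np.zeros(Xb.shape[1])
m = len(y)
for _ in range(num_iters):
    error = Xb @ theta - y               # residuals under current theta
    theta -= (alpha / m) * (Xb.T @ error)

# Predict the price of a 1650 sq-ft, 3-bedroom house: normalize the
# new point with the same mu/sigma used for training.
x_new = (np.array([1650.0, 3.0]) - mu) / sigma
price = np.r_[1.0, x_new] @ theta
print(f"Predicted price of a 1650 sq-ft, 3 br house: {price:.6f}")
```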
The goal of regression is to determine the values of the β parameters that minimize the sum of squared residuals (the differences between predicted and observed values) over the set of observations; a sketch of this objective follows the list below. Since linear regression is restricted to fitting linear (straight line/plane) functions to the data, it cannot capture complex real-world relationships as well as more general techniques such as neural networks, which can model non-linear functions. But linear regression has some interesting advantages:
- Linear regression is the most widely used method, and it is well understood.
- Training a linear regression model is usually much faster than methods such as neural networks.
- Linear regression models are simple and require minimal memory, so they work well on embedded controllers with limited memory space.
- By examining the magnitude and sign of the regression coefficients (β), you can infer how each predictor variable affects the target outcome.
- It is one of the simplest algorithms and is available in many packages, even Microsoft Excel!
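To make the least-squares objective above concrete, here is a minimal sketch using NumPy's least-squares solver on an illustrative dataset (the rows are made up for the example): `np.linalg.lstsq` finds the β that minimizes the sum of squared residuals, and inspecting the sign and magnitude of each coefficient shows how each predictor pushes the prediction up or down.

```python
import numpy as np

# Illustrative data: intercept column, size (sq-ft), bedrooms -> price.
X = np.array([[1.0, 2104.0, 3.0],
              [1.0, 1600.0, 3.0],
              [1.0, 2400.0, 3.0],
              [1.0, 1416.0, 2.0]])
y = np.array([399900.0, 329900.0, 369000.0, 232000.0])

# lstsq returns the beta that minimizes sum((X @ beta - y) ** 2).
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

sse = np.sum((X @ beta - y) ** 2)  # the quantity being minimized
print("beta:", beta)               # sign/magnitude show each predictor's effect
print("sum of squared residuals:", sse)
```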