I continue working with machine learning algorithms. In a previous post I talked about linear regression with one variable and described different algorithms to compute the hypothesis.
This time I’m playing with linear regression with multiple features. Simple linear regression has only one input feature and one output. For example, you can predict the price of a house given the house’s size. But imagine that you want to predict the price of a house using both its size and its number of rooms. When you have more than one input feature, the technique is called ‘multi-variable linear regression’ (also known as multivariate linear regression).
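As a minimal sketch of the idea, here is how a two-feature model could be fitted with NumPy. The dataset below is made up for illustration (it is not the training data from this post), and the `predict` helper is a hypothetical name; the fit uses ordinary least squares, which solves the same problem as the normal equation:

```python
import numpy as np

# Toy training set (made-up numbers, not the post's data):
# columns are size in m^2 and number of rooms.
X = np.array([[50.0, 1], [70.0, 2], [100.0, 3], [120.0, 3], [175.0, 4]])
y = np.array([110000.0, 145000.0, 200000.0, 235000.0, 360000.0])  # prices in EUR

# Add a bias column of ones so theta[0] becomes the intercept.
Xb = np.hstack([np.ones((X.shape[0], 1)), X])

# Fit theta with ordinary least squares.
theta, *_ = np.linalg.lstsq(Xb, y, rcond=None)

def predict(size, rooms):
    """Hypothesis h(x) = theta0 + theta1 * size + theta2 * rooms."""
    return theta[0] + theta[1] * size + theta[2] * rooms

print(predict(70, 2))  # estimated price for a 70 m^2, 2-room house
```

With real data you would of course use your own training set; the shape of the solution (one parameter per feature plus an intercept) stays the same.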
In the following figure we can see the two input features (size and rooms), the training data (red dots), and the predictions (blue dots). In this case we can represent the information with a 3D model. If your model has more than three features, you will need to find another way to visualize all the data.
- For a house with 70m and 2 rooms, we predict a price of 143496.23€
- For a house with 100m and 3 rooms, we predict a price of 203095.22€
- For a house with 175m and 4 rooms, we predict a price of 363570.19€
It’s important to notice that in this case we are working in 3D space, so the regression model is represented by a plane (instead of a line, as in one-variable linear regression). In the following figures you can see how the prediction dots lie on the plane. We always obtain a plane as long as we use two input features (three variables in total) and none of them are quadratic or cubic. Thanks to the mplot3d toolkit you can render 3D graphics and rotate the perspective of the figure to see the regression plane from different points of view.
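A plane like the one described above could be rendered with mplot3d along these lines. The `theta` values here are hypothetical placeholders standing in for an already-fitted model, and the `Agg` backend is used only so the sketch runs without a display:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend: save to file instead of opening a window
import matplotlib.pyplot as plt

# Hypothetical parameters of a fitted model: [intercept, size coef, rooms coef]
theta = np.array([5000.0, 1900.0, 10000.0])

# Grid of (size, rooms) values covering the input space.
sizes = np.linspace(40, 200, 20)
rooms = np.linspace(1, 5, 20)
S, R = np.meshgrid(sizes, rooms)
P = theta[0] + theta[1] * S + theta[2] * R  # predicted price over the grid

fig = plt.figure()
ax = fig.add_subplot(projection="3d")  # mplot3d axes
ax.plot_surface(S, R, P, alpha=0.4)
ax.set_xlabel("size (m²)")
ax.set_ylabel("rooms")
ax.set_zlabel("price (€)")
ax.view_init(elev=20, azim=-60)  # rotate the perspective of the figure
fig.savefig("regression_plane.png")
```

In an interactive session (without the `Agg` backend) you can drag the figure with the mouse to inspect the plane from any angle.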