Polynomial regression is a form of linear regression that fits a non-linear relationship between the input X and the output Y. The trick is to add new features to the equation. But which features? Simply powers of the existing ones: from the feature X1 we can add X1^2 or X1^3 as new features. If we have several input features (X1, X2, X3), we can also add interaction terms such as X1*X2 or X1^2*X3^2. So, the polynomial regression model is:
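As a minimal sketch of this idea (not the post's own code), the feature expansion for single-feature powers can be written in a few lines of NumPy; note that scikit-learn's PolynomialFeatures offers the same expansion including the interaction terms mentioned above:

```python
import numpy as np

def polynomial_features(X, degree):
    """Expand each column x into the powers [x, x^2, ..., x^degree].

    This sketch only adds powers of individual features; interaction
    terms like X1*X2 are not generated here.
    """
    return np.hstack([X ** d for d in range(1, degree + 1)])

# One feature X1, two samples
X = np.array([[2.0],
              [3.0]])
print(polynomial_features(X, 3))  # columns: X1, X1^2, X1^3
```

After this expansion you fit an ordinary linear regression on the new columns, which is why polynomial regression is still a linear model: it is linear in the coefficients, not in X.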
As you can see in the following figure, you can get a linear regression (red line) or a non-linear regression (blue and yellow) if you add more features to your equation. The shape of the curve of the non-linear model depends on the degree of your polynomial. If the degree is close to 1, the model behaves like a traditional linear regression; if the degree is high, the model fits the training points very closely.
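To reproduce this effect numerically (a toy sketch on synthetic data, not the figure's actual dataset), you can fit polynomials of increasing degree with NumPy's polyfit and watch the training error shrink as the degree grows:

```python
import numpy as np

# Synthetic noisy data around a sine curve (illustrative assumption)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.1, size=x.size)

for degree in (1, 3, 9):
    coeffs = np.polyfit(x, y, degree)          # least-squares polynomial fit
    pred = np.polyval(coeffs, x)
    mse = np.mean((y - pred) ** 2)
    print(f"degree {degree}: training MSE = {mse:.4f}")
```

The degree-1 fit is the straight "red line" case; the higher-degree fits bend to follow the points, which leads directly to the next topic.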
Overfitting is the term used to describe a regression model that has lost the capacity to predict new data because it fits too closely to the training points. This can happen when the model is very complex or the degree of the equation is high. Note that overfitting is bad for your model and you should avoid it: the predictions will be very similar or equal to the training points, but will be poor on data the model has not seen.
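One common way to detect this (again a toy sketch on synthetic data, not the post's example) is to hold out some points as a test set: an overfit high-degree polynomial drives the training error down while the held-out error tells a different story:

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(0.0, 1.0, 30)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=x.size)

# Hold out every third point as a test set
train = np.ones(x.size, dtype=bool)
train[::3] = False
x_tr, y_tr = x[train], y[train]
x_te, y_te = x[~train], y[~train]

for degree in (1, 3, 9):
    coeffs = np.polyfit(x_tr, y_tr, degree)
    mse_tr = np.mean((np.polyval(coeffs, x_tr) - y_tr) ** 2)
    mse_te = np.mean((np.polyval(coeffs, x_te) - y_te) ** 2)
    print(f"degree {degree}: train MSE {mse_tr:.3f}, test MSE {mse_te:.3f}")
```

The training error always decreases as the degree grows, so comparing it with the error on held-out points is the practical way to spot when the model has stopped generalizing.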
You can experiment interactively with polynomial regression at the following link: http://www.arachnoid.com/polysolve/index.html
You can also get the source code of this example from my Git repository.