Simple Linear Regression

y = mx + c

Regression fits a straight line through the observations: the "best-fit line". To find the best-fit line, we minimise the sum of the squared errors, where each squared error is the square of the vertical distance between an actual data point and the corresponding point on the line.
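In symbols, this ordinary least squares objective can be written as (standard notation, not taken verbatim from these notes):

\[
\min_{m,\,c} \; \sum_{i=1}^{n} \bigl( y_i - (m x_i + c) \bigr)^2
\]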

This model predicts continuous real-valued outputs, such as a salary or a temperature.

Importing the libraries
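A minimal set of imports for this walkthrough (a sketch assuming the usual NumPy/pandas/Matplotlib stack; the original notebook's exact code is not reproduced here):

```python
import numpy as np               # numerical arrays
import matplotlib.pyplot as plt  # plotting
import pandas as pd              # loading and handling the dataset
```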

Importing the dataset
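A sketch of loading the data, assuming a CSV file named Salary_Data.csv whose last column is the target (Salary) and whose remaining column is the feature (YearsExperience); the file name and layout are assumptions:

```python
# Assumed file name and column layout: YearsExperience, then Salary
dataset = pd.read_csv('Salary_Data.csv')
X = dataset.iloc[:, :-1].values   # feature matrix: years of experience
y = dataset.iloc[:, -1].values    # target vector: salary
```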

Splitting the dataset into the Training set and Test set
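One way to split the data with scikit-learn's train_test_split; the 1/3 test size and the fixed random_state are illustrative choices, not necessarily those used originally:

```python
from sklearn.model_selection import train_test_split

# Hold out one third of the observations for testing; fix the seed for reproducibility
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=1/3, random_state=0)
```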

Training the Simple Linear Regression model on the Training set
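Fitting scikit-learn's LinearRegression on the training data (a sketch consistent with the coef_ and intercept_ attributes mentioned later in these notes):

```python
from sklearn.linear_model import LinearRegression

regressor = LinearRegression()
regressor.fit(X_train, y_train)   # learns the slope (coef_) and intercept (intercept_)
```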

Predicting the Test set results
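Predicting salaries for the held-out test observations with the fitted regressor:

```python
# Predicted salaries for the test set features
y_pred = regressor.predict(X_test)
```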

Visualising the Training set results
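A typical Matplotlib plot of the training points against the fitted line (titles, labels, and colours here are illustrative choices):

```python
plt.scatter(X_train, y_train, color='red')                    # actual training points
plt.plot(X_train, regressor.predict(X_train), color='blue')   # fitted regression line
plt.title('Salary vs Experience (Training set)')
plt.xlabel('Years of Experience')
plt.ylabel('Salary')
plt.show()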

Visualising the Test set results
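The same kind of plot for the test points; the line is still drawn from the training fit, since the fitted model does not change:

```python
plt.scatter(X_test, y_test, color='red')                      # actual test points
plt.plot(X_train, regressor.predict(X_train), color='blue')   # same fitted line as before
plt.title('Salary vs Experience (Test set)')
plt.xlabel('Years of Experience')
plt.ylabel('Salary')
plt.show()
```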

Making a Single Prediction

For example, the salary of an employee with 12 years of experience:
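A sketch of that single prediction (the double square brackets are explained in the note below):

```python
# predict expects a 2D array of shape (n_samples, n_features), hence [[12]]
print(regressor.predict([[12]]))   # prints roughly 138967.5, the value quoted below
```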

Therefore, our model predicts that the salary of an employee with 12 years of experience is about $138,967.50.

Important note: Notice that the value of the feature (12 years) was input in a double pair of square brackets. That is because the "predict" method always expects a 2D array as its input, and putting 12 into a double pair of square brackets makes the input exactly a 2D array. Simply put:

12 → scalar

[12] → 1D array

[[12]] → 2D array

Getting the final linear regression equation with the values of the coefficients
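A sketch of reading off the fitted coefficients from the regressor (scikit-learn exposes them as the coef_ and intercept_ attributes):

```python
print(regressor.coef_)        # slope: roughly 9345.94 for this dataset
print(regressor.intercept_)   # intercept: roughly 26816.19 for this dataset
```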

Therefore, the equation of our simple linear regression model is:

Salary = 9345.94 × YearsExperience + 26816.19

Important Note: To get these coefficients we read the "coef_" and "intercept_" attributes of our regressor object (note the trailing underscores).