Linear regression is a predictive analytics technique that models the relationship between one dependent variable and one or more independent variables.
You may remember the equation below from school. It is a simple equation that helps us estimate things like the best price of a house or a stock, or maybe how long a customer will stay. Very easy, right?
Y = mX + c (or) Y = mX + b

m = slope = sum[(x - mean(x))(y - mean(y))] / sum[(x - mean(x))^2]

(or) m = change in Y / change in X

c or b = Y - mX
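The slope and intercept formulas above can be sketched in a few lines of plain Python (no libraries needed); `fit_line` is just an illustrative helper name:

```python
def fit_line(xs, ys):
    """Return (m, c) for the best fit line y = m*x + c."""
    x_mean = sum(xs) / len(xs)
    y_mean = sum(ys) / len(ys)
    # m = sum[(x - mean(x)) * (y - mean(y))] / sum[(x - mean(x))^2]
    numerator = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
    denominator = sum((x - x_mean) ** 2 for x in xs)
    m = numerator / denominator
    # c = y_mean - m * x_mean
    c = y_mean - m * x_mean
    return m, c

m, c = fit_line([5, 7], [10, 15])
print(m, c)  # 2.5 -2.5
```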
Let me help you break down this linear regression equation. I believe that after reading this post you will get the linear regression concept in a very fun way.
There are multiple types of regression techniques; these two are the ones we use most often:
- Simple linear regression
- Multivariable linear regression
Y = mX + b — Simple linear regression

Y = m1X1 + m2X2 + m3X3 + ... + mnXn + b — Multivariable linear regression
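To make the contrast between the two forms concrete, here is a minimal sketch; the coefficient values below are made up purely for illustration:

```python
def simple_predict(x, m, b):
    # Y = mX + b
    return m * x + b

def multi_predict(xs, ms, b):
    # Y = m1X1 + m2X2 + ... + mnXn + b
    return sum(m * x for m, x in zip(ms, xs)) + b

print(simple_predict(3, 2.0, 1.0))             # 7.0
print(multi_predict([3, 4], [2.0, 0.5], 1.0))  # 9.0
```

The simple form takes one input; the multivariable form takes one coefficient per input and sums the contributions.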
The difference between simple linear regression and multivariable linear regression is the number of independent variables in the data set that are used for prediction. Read more.
The math behind the simple linear regression
Let us consider 2 points, (5, 10) and (7, 15), as (x1, y1) and (x2, y2). Whenever we want to draw a straight line through the points, we need the slope and the intercept.

Ok, we have X's and Y's, so let's plot the points.
Now calculate the slope of these 2 points.
m = (y2 - y1) / (x2 - x1) = (15 - 10) / (7 - 5) = 5 / 2 = 2.5

m = 2.5
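The slope calculation above, written out in Python:

```python
# Slope between the two points (x1, y1) = (5, 10) and (x2, y2) = (7, 15).
x1, y1 = 5, 10
x2, y2 = 7, 15

# Rise over run: change in Y divided by change in X.
m = (y2 - y1) / (x2 - x1)
print(m)  # 2.5
```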
Now calculate the intercept for the line. The intercept is simply the point where the line touches the y-axis: if x = 0, then y = b (or c).
Here, we take x as x_mean = 6 and y as y_mean = 12.5, and substitute them:

c = y_mean - m * x_mean = 12.5 - (2.5)(6) = 12.5 - 15.0 = -2.5
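The same intercept calculation in Python, using the means of the two points and the slope we already found:

```python
xs, ys = [5, 7], [10, 15]
m = 2.5  # slope calculated earlier

x_mean = sum(xs) / len(xs)  # 6.0
y_mean = sum(ys) / len(ys)  # 12.5

# c = y_mean - m * x_mean
c = y_mean - m * x_mean
print(c)  # -2.5
```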
Now if you substitute the X, m, and c values into the equation y = mX + b, you will get back the same Y values we started with.
Finally, we have the slope, X, Y, and the intercept, so let's plot it. Based on these values we can predict Y for new X points. Let's predict the Y value for X = 8.
Here, X = 8 (new point), m = 2.5, c = -2.5

y = mX + b => y = 2.5(8) + (-2.5) => y = 17.5
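The prediction step above as code, reusing the slope and intercept we derived:

```python
m, c = 2.5, -2.5

def predict(x):
    # y = m*x + c
    return m * x + c

print(predict(8))  # 17.5
```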
Do you see what we have done so far? We just predicted a value for a new data point, like a linear regression model would, without writing a single line of code. Now we can easily draw a line that passes through all the points.
This is how we solve a simple linear regression problem and predict outputs for new data points. In this case, our loss is 0, because our line passes through every point.
But in general, we have a large dataset consisting of many data points, so we can't do this by hand; we must use our lovely computers and our beautiful brains to work smart.
Let's look at the picture below. Can you draw a line that passes through all the points?
Of course not! But we can draw a line that at least comes close to most of the data points. Can you help me find the best fit line for these data points?
So, in this case, we need our computers and brains to find the best fit line.
Best fit line: the line that minimizes the error rate (also called the cost function or loss).
We calculate the loss for each of the 3 lines above and decide that the best fit line is the one that gives the lowest cost or loss.
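Picking the lowest-loss line can be sketched as below. The three candidate (m, c) pairs and the data points are made up for illustration, and the loss here is mean squared error, one common choice of cost function:

```python
# Hypothetical data points and three candidate lines (slope, intercept).
points = [(1, 2), (2, 4.1), (3, 5.9), (4, 8.2)]
candidates = [(2.0, 0.0), (1.5, 1.0), (2.5, -1.0)]

def mse(m, c):
    # Mean squared error: average squared gap between each y and the line.
    return sum((y - (m * x + c)) ** 2 for x, y in points) / len(points)

# The best fit line among the candidates is the one with the lowest loss.
best = min(candidates, key=lambda mc: mse(*mc))
print(best)  # (2.0, 0.0)
```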
Note: In the real world, X is our features (variables, or independent variables) and Y is our dependent variable.
For linear regression, we have model evaluation techniques. Read more: Model evaluation techniques for Regression.
I hope you got the simple math behind simple linear regression. In the same way, we can calculate for multivariable linear regression too.
Check out more algorithms here: Machine learning algorithms