About Linear Regression
Every Data Scientist starts with this one. So, here it is.
Linear Regression is one of the most widely used Machine Learning algorithms in real-life problems, thanks to its simplicity, interpretability, and speed. In the next few minutes, we will understand what goes on behind the working of this algorithm.
What is Linear Regression?
Linear Regression is a supervised learning algorithm that models the relationship between a dependent (target) variable and one or more independent (input) variables by fitting a straight line through the data. It helps determine the strength of the relationship between the variables and predict the value of the target for new inputs.
Assumptions of Linear Regression:
Linearity: the relationship between the independent variables and the target is linear.
Independence: the observations (and their errors) are independent of one another.
Homoscedasticity: the residuals have roughly constant variance across all input values.
Normality: the residuals are approximately normally distributed.
Types of Linear Regression:
Simple Linear Regression: Simple Linear Regression finds the linear relationship between two continuous variables: one independent feature and one dependent feature.
The formula can be represented as y = mx + b, where m is the slope and b is the intercept.
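As a quick sketch (the data values below are made up for illustration), NumPy's `polyfit` with degree 1 fits exactly this y = mx + b form by least squares:

```python
import numpy as np

# Made-up sample data: y is roughly 2*x + 1 plus a little noise.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([3.1, 4.9, 7.2, 9.0, 11.1])

# A degree-1 polynomial fit is simple linear regression (least squares).
m, b = np.polyfit(x, y, 1)
print(f"slope m = {m:.2f}, intercept b = {b:.2f}")
```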
Multiple Linear Regression: We often use Multiple Linear Regression for predictive analysis, because real-world data usually has more than one independent feature.
The formula can be represented as Y = m1X1 + m2X2 + m3X3 + … + b, where each feature Xi gets its own coefficient mi.
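A minimal sketch of multiple regression with NumPy's least-squares solver (the feature values and coefficients below are made up; appending a column of ones lets the intercept b be learned along with the slopes):

```python
import numpy as np

# Made-up data with two features; y was generated as 1*x1 + 2*x2 + 0.5.
X = np.array([[1.0, 1.0],
              [2.0, 1.0],
              [3.0, 2.0],
              [4.0, 3.0],
              [5.0, 3.0]])
y = np.array([3.5, 4.5, 7.5, 10.5, 11.5])

# Add a bias column of ones so the intercept is part of the solution.
A = np.hstack([X, np.ones((len(X), 1))])
m1, m2, b = np.linalg.lstsq(A, y, rcond=None)[0]
```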
How Linear Regression Works:
Linear Regression fits the straight line that minimizes the sum of the squared vertical distances (errors) between the data points and the line. This line is also called the Least Squares Regression Line (LSRL).
Y = mX + c, where m = slope and c = intercept
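The slope and intercept of the LSRL have a simple closed form; here is a small worked sketch with made-up numbers (the slope m is the covariance of X and Y divided by the variance of X):

```python
import numpy as np

# Made-up data roughly following Y = 2*X.
X = np.array([1.0, 2.0, 3.0, 4.0])
Y = np.array([2.0, 4.1, 6.0, 7.9])

# Least-squares slope: covariance(X, Y) / variance(X).
m = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
# The LSRL always passes through the point of means (mean of X, mean of Y).
c = Y.mean() - m * X.mean()
```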
How do we decide whether the model is good or not:
If the total error is low, it is a good model; otherwise, it is a bad model.
How to Calculate the Average or Total Error:
Mean Squared Error (MSE)
Mean Absolute Error (MAE)
Root Mean Squared Error (RMSE)
“Important: MSE, MAE, and RMSE are error functions that quantify the total error made by the model.”
“The model is good only if the total error is low.”
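All three metrics can be computed in a few lines; a sketch with made-up values:

```python
import numpy as np

y_true = np.array([3.0, 5.0, 7.0, 9.0])   # made-up actual values
y_pred = np.array([2.5, 5.5, 6.5, 9.5])   # made-up model predictions

mse = np.mean((y_true - y_pred) ** 2)   # Mean Squared Error
mae = np.mean(np.abs(y_true - y_pred))  # Mean Absolute Error
rmse = np.sqrt(mse)                     # Root Mean Squared Error
```

Note that RMSE is just the square root of MSE, which brings the error back into the same units as the target variable.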
How to evaluate the model? / How to measure the performance of the model?
R-squared (R² score): R² measures the proportion of the variance in the target variable that the model explains; the closer it is to 1, the better the fit.
Disadvantages of R²:
R² increases (or at least never decreases) as more independent variables are added, even when those variables have very little relationship with the target. To overcome this issue, we use Adjusted R².
Adjusted R²:
Adjusted R² measures the performance of the model while penalizing columns (features) that have very little relationship with the target.
NOTE: “Adjusted R² is always less than or equal to R²; if the two values are close, the model's features are genuinely informative.”
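A sketch of both metrics computed from their definitions (the function names and sample values here are my own, not from the article):

```python
import numpy as np

def r2_score(y_true, y_pred):
    # 1 - (residual sum of squares / total sum of squares)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1 - ss_res / ss_tot

def adjusted_r2(r2, n, p):
    # n = number of samples, p = number of independent features.
    # The (n - 1) / (n - p - 1) factor penalizes extra features.
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

y_true = np.array([1.0, 2.0, 3.0, 4.0])
y_pred = np.array([1.1, 1.9, 3.2, 3.8])   # made-up predictions
r2 = r2_score(y_true, y_pred)
adj = adjusted_r2(r2, n=4, p=1)
```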
Linear Regression with Gradient Descent:
Gradient descent is an iterative optimization algorithm for finding a local minimum of a function (moving toward a local maximum instead is called gradient ascent). Gradient descent is also known as steepest descent. Its main objective is to minimize the cost function (here, the MSE) through iteration. To achieve this goal, it performs two steps repeatedly:
1. Compute the gradient (slope) of the cost function at the current parameter values.
2. Take a step in the direction opposite to the gradient, scaled by the learning rate.
How does Gradient Descent work?
Gradient descent starts with a random slope and intercept and works iteratively toward the global minimum (for linear regression with MSE, the cost surface is convex, so the local minimum it finds is also the global one).
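A minimal from-scratch sketch of those two steps for simple linear regression (the learning rate and iteration count are arbitrary choices, and the data is made up to follow y = 2x + 1 exactly):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.0, 5.0, 7.0, 9.0])   # exactly y = 2*x + 1

m, c = 0.0, 0.0   # start from an arbitrary point
lr = 0.05         # learning rate (step size)

for _ in range(5000):
    pred = m * x + c
    # Gradients of MSE = mean((y - pred)^2) with respect to m and c.
    dm = -2 * np.mean(x * (y - pred))
    dc = -2 * np.mean(y - pred)
    # Step in the direction opposite to the gradient.
    m -= lr * dm
    c -= lr * dc
```

After enough iterations, m and c converge close to the true values 2 and 1; too large a learning rate would make the updates diverge instead.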
I hope this article helped you understand the algorithm and most of the concepts related to it.
Coming up next week, we will understand Logistic Regression.
HAPPY LEARNING!!!!!
Like my article? Do give it a clap and share it, as that will boost my confidence. Also, I post new articles every Sunday, so stay connected for future articles in this basics of data science and machine learning series.
Also, do connect with me on
For model building, do connect with me on GitHub.
Thank You!