A Brief Introduction to Linear Regression

Kranthi kumar
2 min readMay 6, 2020

Here I give you a brief insight into linear regression without going into the actual math behind it.

Before learning about linear regression, we need to know about supervised learning.

Supervised learning -

A supervised learning algorithm works with a target or outcome variable (the dependent variable) that is to be predicted from a given set of independent variables (predictors). With the help of these variables, we generate a function that maps inputs to the desired outputs. We train this process until we get the desired level of accuracy on the training data.

So, the algorithm adjusts itself according to the patterns it perceives in the inputs and outputs it receives.
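As a toy illustration of this idea (my own sketch, not from the article), we can "learn" a mapping from labelled (input, output) pairs and then use it to predict an unseen input:

```python
import numpy as np

# Known historical examples: each x is a predictor, each y the outcome
# we want the model to learn to predict.
X = np.array([1.0, 2.0, 3.0, 4.0])   # independent variable (predictor)
y = np.array([2.0, 4.0, 6.0, 8.0])   # dependent variable (target)

# "Training": estimate a function y ≈ a*X + b from the examples
# (here via least squares for brevity).
a, b = np.polyfit(X, y, deg=1)

# The learned function now maps a new input to a predicted output.
def predict(x):
    return a * x + b

print(predict(5.0))   # close to 10.0, the pattern in the data
```

The point is only the workflow: labelled data in, a fitted mapping out, then predictions on new inputs.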

This kind of algorithm usually applies to cases involving historical data. For example, using historical credit card transactions, it can predict the likelihood that a future transaction is faulty or fraudulent.

Regression, decision trees, random forests, KNN, logistic regression, etc. are examples of supervised learning algorithms.

Linear regression

  • This is a supervised learning algorithm
  • It is used to estimate real-valued outputs based on continuous variables, like the number of calls, sales, etc.
  • In this algorithm, we establish a linear relationship between the independent and dependent variables by fitting the best line (also known as the regression line, Y = a*X + b).
  • We have data set x and corresponding target value y.
  • Linear regression is a parametric method, which means it makes an assumption about the form of the function relating X and Y.
  • Our model will be a function that predicts ŷ given a specific x:
  • In the 2-dimensional case, ŷ = β0 + β1*x + error
  • β0 is the y-intercept and β1 is the slope of our line. Our goal is to learn the model parameters (in this case, β0 and β1) that minimize the error (noise) in the model’s predictions.
  • To find the values of the parameters, we need a cost function (also known as a loss function), which measures how inaccurate our model’s predictions are. We select β so that it minimizes the cost function. In 2D this gives the best-fit line; in 3D, the best-fit plane.
  • For minimizing the cost function, we use the gradient descent algorithm.
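The steps above can be sketched in code. This is my own minimal illustration (toy data, not from the article): fit ŷ = β0 + β1*x by gradient descent, repeatedly nudging the parameters in the direction that lowers the mean-squared-error cost.

```python
import numpy as np

# Toy data, roughly following y = 1 + 2x with a little noise.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([3.1, 4.9, 7.2, 9.0, 10.8])

b0, b1 = 0.0, 0.0   # start from arbitrary parameter values
lr = 0.01           # learning rate (step size)

def cost(b0, b1):
    # Mean squared error: how inaccurate the current line's predictions are.
    return np.mean((y - (b0 + b1 * x)) ** 2)

for _ in range(5000):
    residual = y - (b0 + b1 * x)
    # Step each parameter along the negative gradient of the MSE.
    b0 += lr * 2 * np.mean(residual)
    b1 += lr * 2 * np.mean(residual * x)

# The parameters settle near the best-fit intercept (~1.15)
# and slope (~1.95) for this data.
print(round(b0, 2), round(b1, 2))
```

Each iteration moves the line a little closer to the one that minimizes the cost, which is exactly the "best-fit line" described above.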
