Assumptions of Linear Regression Explained - By Saurabh
1. Linearity
There should be a linear relationship between the predictors (the independent variables) and the response (the dependent variable).
You can use a scatter plot of each predictor against the response to check for linearity, as sketched below.
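A minimal sketch of that check, assuming a small pandas DataFrame with hypothetical columns "x" (predictor) and "y" (response); a roughly straight point cloud suggests linearity:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical data: 'x' is a predictor, 'y' is the response.
df = pd.DataFrame({"x": [1, 2, 3, 4, 5, 6, 7, 8],
                   "y": [2.1, 3.9, 6.2, 8.1, 9.8, 12.2, 14.1, 15.8]})

# A roughly straight point cloud suggests a linear relationship.
df.plot.scatter(x="x", y="y")
plt.title("Predictor vs. response")
plt.show()
```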
2. No Outliers
There should be no extreme outliers in the data. You can check for outliers with a box plot, as in the sketch below.
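A quick sketch of the box-plot check, using a hypothetical "income" column with one obvious outlier; points drawn beyond the whiskers are potential outliers:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical column with one obvious outlier (200).
df = pd.DataFrame({"income": [35, 40, 42, 45, 47, 50, 52, 200]})

# Points plotted beyond the whiskers (1.5 * IQR by default) are potential outliers.
df.boxplot(column="income")
plt.show()
```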
3. No Multicollinearity
There should be no multicollinearity, i.e. the independent variables should not be strongly correlated with each other. A quick check is shown below.
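One simple way to check this (a standard practice, not mentioned in the post itself) is to look at the pairwise correlations between the predictors; the columns x1, x2, x3 below are hypothetical:

```python
import pandas as pd

# Hypothetical predictors: x1 and x2 are strongly related, x3 is not.
X = pd.DataFrame({"x1": [1, 2, 3, 4, 5, 6, 7, 8],
                  "x2": [2, 4, 6, 8, 10, 12, 14, 17],
                  "x3": [5, 3, 8, 1, 9, 2, 7, 4]})

# Pairwise correlations close to +1 or -1 between predictors flag multicollinearity.
print(X.corr().round(2))
```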
4. No Autocorrelation
There should be no correlation between the residual (error) terms. The presence of such correlation is known as autocorrelation. A quick check is sketched below.
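A common way to check this (again standard practice, not from the post itself) is the Durbin-Watson statistic on the residuals of a fitted model; the data here is synthetic for illustration:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

# Hypothetical data: fit a simple OLS model and inspect its residuals.
rng = np.random.default_rng(0)
x = np.arange(50, dtype=float)
y = 3.0 * x + rng.normal(scale=2.0, size=50)

model = sm.OLS(y, sm.add_constant(x)).fit()

# A Durbin-Watson statistic near 2 suggests no autocorrelation;
# values toward 0 or 4 suggest positive or negative autocorrelation.
print(durbin_watson(model.resid))
```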
5. Normality
The dependent variable should be normally distributed. If it is not, you can often fix the skew by applying a log transform, as in the sketch below.
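A minimal sketch of the log-transform idea, using a synthetic right-skewed variable (hypothetical "price" data) and comparing the histograms before and after the transform:

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical right-skewed dependent variable (e.g. prices).
rng = np.random.default_rng(0)
y = pd.Series(rng.lognormal(mean=3.0, sigma=0.8, size=500), name="price")

fig, axes = plt.subplots(1, 2)
y.hist(ax=axes[0], bins=30)
axes[0].set_title("Original (skewed)")

# A log transform often makes a right-skewed variable look roughly normal.
np.log(y).hist(ax=axes[1], bins=30)
axes[1].set_title("Log-transformed")
plt.show()
```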
It’s not uncommon for these assumptions to be violated on real-world data, but it’s important to check them so we can either fix the violations and/or be aware of the flaws in the model when presenting results or making decisions.
It is dangerous to make decisions based on a model that violates its assumptions, because those decisions are effectively being formulated on made-up numbers. Worse, it also gives a false sense of security, since the process appears empirical even though it is not. Real empiricism requires due diligence, which is why these assumptions exist and are stated up front.