In the previous edition of Making Statistical Concepts Accessible, we introduced the Multiple Linear Regression Model, focusing on its goals and conceptual foundations.

You can review it via the following link:

Making Statistical Concepts Accessible – Edition 6

We highlighted that linear regression serves two primary purposes: making predictions and drawing inferences. In both cases, estimating the regression coefficients, often referred to as the “betas”, is essential.

But how are these “betas” calculated, and what do they represent?
These are central questions in any practical analysis involving linear models. Statistical software such as R, Python, SAS, and SPSS estimates these coefficients as the basis for further inference or prediction.
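For reference (this equation does not appear in the excerpt above, but it is the standard form of the model), the multiple linear model with p predictors can be written as:

```latex
y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \dots + \beta_p x_{ip} + \varepsilon_i,
\qquad i = 1, \dots, n
```

Each beta can then be read as the expected change in the response for a one-unit increase in its predictor, with the other predictors held fixed.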

Under ideal conditions, when the model assumptions are met, the Ordinary Least Squares (OLS) method is used to estimate the betas. Interestingly, when the errors are normally distributed, OLS yields exactly the same estimates as Maximum Likelihood Estimation (MLE), a standard method in statistical theory.
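To make this concrete, here is a minimal Python sketch (an illustration, not part of the original article) that computes the betas with the closed-form OLS solution on synthetic data and checks the result against NumPy's least-squares solver:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: n observations, 2 predictors plus an intercept column
n = 200
X = np.column_stack([np.ones(n),                  # intercept
                     rng.normal(size=n),          # predictor 1
                     rng.normal(size=n)])         # predictor 2
true_beta = np.array([1.0, 2.0, -0.5])
y = X @ true_beta + rng.normal(scale=0.3, size=n)  # normally distributed errors

# Closed-form OLS estimate: beta_hat = (X'X)^(-1) X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Same estimate from NumPy's least-squares solver
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

print("closed-form OLS:", beta_hat)
print("lstsq:          ", beta_lstsq)
# With normally distributed errors, these OLS estimates also coincide
# with the maximum likelihood estimates of the betas.
```

Solving the normal equations directly is shown here for clarity; in practice, numerically more stable routines based on QR or SVD decompositions (as used by lstsq and by statistical packages) are preferred.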

The goal of this edition is to explore the standard assumptions required for OLS to perform well and to discuss alternatives when those assumptions are violated.

Assumptions for Multiple Linear Regression and Ordinary Least Squares (OLS)

To ensure reliable and interpretable results, the following assumptions must be met:

- Linearity: the mean of the response is a linear function of the predictors.
- Independence: the errors are independent of one another.
- Homoscedasticity: the errors have constant variance.
- Normality: the errors are normally distributed (needed for exact inference and for the OLS/MLE equivalence mentioned above).
- No perfect multicollinearity: no predictor is an exact linear combination of the others.
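As a companion illustration (again, not from the original article, and assuming the statsmodels library), the sketch below fits an OLS model and runs two common diagnostics: variance inflation factors to flag multicollinearity and the Breusch-Pagan test to flag non-constant error variance.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(scale=0.3, size=n)

# Design matrix with an intercept column
X = sm.add_constant(np.column_stack([x1, x2]))
model = sm.OLS(y, X).fit()
print(model.params)      # estimated betas
print(model.summary())   # coefficients, standard errors, R-squared, etc.

# Multicollinearity: VIF per predictor (values well above ~5-10 are a warning sign)
for j in range(1, X.shape[1]):
    print(f"VIF predictor {j}:", variance_inflation_factor(X, j))

# Homoscedasticity: Breusch-Pagan test (a small p-value suggests non-constant variance)
lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(model.resid, X)
print("Breusch-Pagan p-value:", lm_pvalue)
```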

Complete Article on LinkedIn

The full article is available at the following link:

Read the full article here

We welcome your comments and questions, and invite you to follow us for more insights.