Linear Regression in R
Overview
The following how-to guides show how to conduct correlation and regression analyses in R. The focus is on implementation rather than on conceptual background. For an introduction to the concepts, I suggest the 1st edition of Field, Miles, and Field's "Discovering Statistics Using R", which is an excellent introduction to many types of statistical analyses and how to conduct them in R. I will also use sample datasets from the Field et al. text to demonstrate these analyses.
Requirements
- Working installation of RStudio & R
- Packages (see the setup sketch after this list):
- tidyverse
- rstatix
- broom
- car
- QuantPsyc
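If you are starting from a fresh installation, a minimal setup sketch might look like the following (run `install.packages()` once, then load the packages in each new session):

```r
# Install the packages listed above (only needs to be done once)
install.packages(c("tidyverse", "rstatix", "broom", "car", "QuantPsyc"))

# Load them for the current session
library(tidyverse)
library(rstatix)
library(broom)
library(car)
library(QuantPsyc)
```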
Topics covered
Linear Regression Pt. 1 - Simple Linear Regression
This guide is the first part of a series on linear regression. I will walk through how to build a simple linear regression model with one outcome variable and one predictor variable.
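As a preview, a minimal sketch of fitting such a model with `lm()` is shown below. The file name `Supermodel.dat` and the column names `salary` and `age` are assumptions based on the example described in Part 2; adjust them to match your copy of the data.

```r
library(tidyverse)

# Assumed file and column names for the supermodel example (Field et al.)
supermodels <- read_delim("Supermodel.dat", delim = "\t")

# Simple linear regression: one outcome (salary), one predictor (age)
simple_model <- lm(salary ~ age, data = supermodels)

# Coefficients, R-squared, and the overall F-test
summary(simple_model)
```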
Linear Regression Pt. 2 - Multiple Linear Regression
In the previous guide, we built a simple linear regression model to predict salary from age in a sample of supermodels. In this second part, we extend it to a more complex model that predicts salary from age along with other variables, including years of experience and a rating of attractiveness.
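A sketch of that fuller model, again with assumed column names (`years` for years of experience and `beauty` for the attractiveness rating):

```r
# Multiple linear regression: salary predicted from age, experience, and
# attractiveness (column names are assumptions, as noted above)
multiple_model <- lm(salary ~ age + years + beauty, data = supermodels)

summary(multiple_model)

# Standardised (beta) coefficients via the QuantPsyc package
QuantPsyc::lm.beta(multiple_model)
```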
Linear Regression Pt. 3 - Casewise Diagnostics
After building our simple and multiple regression models, we turn our attention to casewise diagnostics to identify outliers in the sample and data points that exert undue influence on the model and could affect its stability.
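A sketch of the kind of per-case diagnostics covered there, using `broom::augment()` on the (assumed) multiple regression model from Part 2:

```r
library(broom)
library(dplyr)

# augment() appends per-case diagnostics to the data: .std.resid
# (standardised residuals), .cooksd (Cook's distance), and .hat (leverage)
case_diagnostics <- augment(multiple_model)

# Common rule-of-thumb flags: |standardised residual| > 2 suggests a
# potential outlier; Cook's distance > 1 suggests an influential case
case_diagnostics %>%
  filter(abs(.std.resid) > 2 | .cooksd > 1)
```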
Linear Regression Pt. 4 - Model Assumptions
We continue the series on regression by exploring the assumptions of our model. We focus on the assumptions of independence of residuals, multicollinearity, and normality of residuals, and test the assumption of independent errors with the Durbin-Watson test.
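A sketch of those checks on the (assumed) model object from the earlier parts, using the `car` package for the Durbin-Watson test and variance inflation factors:

```r
library(car)

# Independence of residuals: a Durbin-Watson statistic near 2 suggests
# uncorrelated errors
durbinWatsonTest(multiple_model)

# Multicollinearity: VIF values well above 10 (tolerance = 1/VIF below 0.1)
# are a common cause for concern
vif(multiple_model)
1 / vif(multiple_model)

# Normality of residuals: visual check with a Q-Q plot of the
# standardised residuals
qqnorm(rstandard(multiple_model))
qqline(rstandard(multiple_model))
```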