What plots in R did you produce to communicate your linear regression results?
One Predictor and One Response Variable
If you have just one predictor variable and one response variable, the visualization is straightforward. For example, take the built-in cars dataset, which records a car's stopping distance against its speed. We are predicting the stopping distance based on the car's speed.
> model = lm(dist ~ speed, data = cars)
> model$coefficients
(Intercept)       speed
 -17.579095    3.932409
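With the fitted model you can also report point predictions via predict(); a minimal sketch, where the speed of 20 mph is just an illustrative value:

> predict(model, newdata = data.frame(speed = 20))

This is simply -17.579 + 3.932 * 20 ≈ 61.1, i.e. a predicted stopping distance of about 61 feet.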
Now, let’s plot the data together with the fitted line.
library(ggplot2)
ggplot() +
  geom_point(data = cars, aes(x = speed, y = dist)) +   # observed data
  geom_abline(intercept = model$coefficients[1], slope = model$coefficients[2])   # fitted line

This gives a good visual of how well the model fits the data.
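An equivalent shortcut is to let ggplot2 fit and draw the line itself; a quick sketch using geom_smooth(), which also shades a confidence band around the fit:

ggplot(cars, aes(x = speed, y = dist)) +
  geom_point() +                 # observed data
  geom_smooth(method = "lm")     # fitted line with a confidence band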
Multiple Predictors
What about regression with multiple predictors? The straight answer is that there is no good way to visualize a linear regression model with more than one predictor. You can go 3-D for two predictors, but such plots are not easy to interpret.
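One workaround that scales to any number of predictors is to plot the observed response against the model's fitted values; a minimal sketch on mtcars (fit_all is just an illustrative name, and the same model is fitted again with its summary output below):

fit_all = lm(mpg ~ ., data = mtcars)     # mpg regressed on all other columns
ggplot(data.frame(observed = mtcars$mpg, fitted = fitted(fit_all)),
       aes(x = fitted, y = observed)) +
  geom_point() +
  geom_abline(intercept = 0, slope = 1)  # points near this line indicate a good fit

The closer the points hug the 45-degree line, the better the model reproduces the observed values.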
Since visualizing a model with more than two predictors is a challenge, we tend to focus more on validation. For example, calling plot(model) on a fitted model produces the standard diagnostic plots: Residuals vs Fitted, a Q-Q plot of the residuals, Scale-Location, and Residuals vs Leverage.
> model = lm(mpg ~ ., data = mtcars)
> summary(model)
Call:
lm(formula = mpg ~ ., data = mtcars)
Residuals:
    Min      1Q  Median      3Q     Max
-3.4506 -1.6044 -0.1196  1.2193  4.6271

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept) 12.30337   18.71788   0.657   0.5181
cyl         -0.11144    1.04502  -0.107   0.9161
disp         0.01334    0.01786   0.747   0.4635
hp          -0.02148    0.02177  -0.987   0.3350
drat         0.78711    1.63537   0.481   0.6353
wt          -3.71530    1.89441  -1.961   0.0633 .
qsec         0.82104    0.73084   1.123   0.2739
vs           0.31776    2.10451   0.151   0.8814
am           2.52023    2.05665   1.225   0.2340
gear         0.65541    1.49326   0.439   0.6652
carb        -0.19942    0.82875  -0.241   0.8122
> plot(model)
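In an interactive session, plot(model) draws these plots one at a time; a small sketch to lay all four out on a single page instead:

> par(mfrow = c(2, 2))   # 2 x 2 grid of plots
> plot(model)
> par(mfrow = c(1, 1))   # reset the layout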