Super learner
This user is a Super Learner. To become a Super Learner, you need to reach Level 8.
Last answered: 15 Sept 2022
Posted on: 14 Sept 2022
Why is adjusted R-squared always smaller than R-squared?

Good afternoon. By definition, if I've understood correctly, adjusted R^2 is the R^2 used for multiple regressions and tends to be closer to 1, because with multiple regressions the explanatory power can only increase or stay the same. So why do you say at 0:52 that adjusted R^2 is always smaller than R^2? If it's closer to 1, shouldn't it be greater than the previous R^2? Thank you

3 answers (0 marked as helpful)
Posted on: 15 Sept 2022

Hi Alessandro,
Good to hear from you. The idea of adjusted R^2 is that it penalizes you for adding independent variables that do not contribute much explanatory power. Even with a single independent variable, adjusted R^2 sits slightly below R^2, and as more variables are added the gap widens unless each new variable genuinely improves the fit. In other words, adjusted R^2 is at most equal to R^2, never larger.
Best,
Ned
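Ned's point can be checked numerically. Below is a minimal sketch using numpy (the simulated data, variable names, and the helper function are illustrative, not from the video): we fit a regression with one useful predictor, then add an unrelated "junk" predictor. R^2 cannot go down, but adjusted R^2 is penalized for the extra variable.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)             # a genuinely useful predictor
junk = rng.normal(size=n)           # unrelated noise variable
y = 2.0 * x1 + rng.normal(size=n)

def r2_and_adjusted(X, y):
    """Return (R^2, adjusted R^2) for an OLS fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])   # prepend intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    r2 = 1 - ss_res / ss_tot
    k = X1.shape[1] - 1                          # number of predictors
    adj = 1 - (1 - r2) * (len(y) - 1) / (len(y) - k - 1)
    return r2, adj

r2_one, adj_one = r2_and_adjusted(x1.reshape(-1, 1), y)
r2_two, adj_two = r2_and_adjusted(np.column_stack([x1, junk]), y)
```

Comparing the two fits: `r2_two` is at least `r2_one` (adding a regressor can only help the raw fit), while each adjusted value stays at or below its plain R^2 because of the (n - 1)/(n - k - 1) penalty factor.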

Posted on: 15 Sept 2022

But why can adjusted R^2 with more variables be equal to or smaller than R^2, and not equal or bigger? Around minute 0:45 I understood the latter, while at 0:53 the video says the opposite

Posted on: 15 Sept 2022

I think I've understood now. I thought that adjusted R^2 was just the new name of R^2 for multiple variables, but it's actually a different measure. But what is the mathematical formula of adjusted R^2? I only know the one for R^2. Thanks
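For reference, the standard adjusted R^2 formula (a well-known identity, not taken from the video) is:

$$\bar{R}^2 = 1 - (1 - R^2)\,\frac{n - 1}{n - k - 1}$$

where $n$ is the number of observations and $k$ is the number of independent variables. Since $(n - 1)/(n - k - 1) \geq 1$ whenever $k \geq 1$, the adjustment can only shrink $R^2$ (or leave it unchanged when $R^2 = 1$), which is why adjusted R^2 is never larger than R^2.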
