Data

A GAM fitted to Furrycat’s data shows a strong relationship between mind and two of the predictors: intellect and hardiness.

library(mgcv)

model <- gam(
  formula = mind ~
    s(hardiness) +
    s(fortitude) + 
    s(dexterity) + 
    s(endurance) +
    s(intellect) + 
    s(cleverness) + 
    s(courage) + 
    s(dependability) +
    s(power) +
    s(fierceness) +
    armor,
  family = gaussian(),
  data = normalized_df
)
## 
## Family: gaussian 
## Link function: identity 
## 
## Formula:
## mind ~ s(hardiness) + s(fortitude) + s(dexterity) + s(endurance) + 
##     s(intellect) + s(cleverness) + s(courage) + s(dependability) + 
##     s(power) + s(fierceness) + armor
## 
## Parametric coefficients:
##              Estimate Std. Error   t value Pr(>|t|)    
## (Intercept) 5011.4388     0.4553 11006.243   <2e-16 ***
## armor          0.8563     1.7452     0.491    0.624    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Approximate significance of smooth terms:
##                    edf Ref.df         F p-value    
## s(hardiness)     1.000  1.000 4.135e+05  <2e-16 ***
## s(fortitude)     5.253  6.035 1.703e+00  0.1225    
## s(dexterity)     2.364  3.042 2.229e+00  0.0856 .  
## s(endurance)     1.000  1.000 7.350e-01  0.3919    
## s(intellect)     1.000  1.000 1.017e+07  <2e-16 ***
## s(cleverness)    2.908  3.467 1.079e+00  0.3780    
## s(courage)       2.937  3.565 1.404e+00  0.2747    
## s(dependability) 1.000  1.000 3.220e-01  0.5708    
## s(power)         1.000  1.000 6.283e+00  0.0126 *  
## s(fierceness)    1.000  1.000 8.900e-02  0.7657    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## R-sq.(adj) =      1   Deviance explained =  100%
## GCV = 26.894  Scale est. = 25.334    n = 370
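A quick way to read the smooth-term table above: an edf of exactly 1 means the penalty has shrunk that smooth all the way down to a straight line. A minimal, self-contained sketch of that behaviour on simulated data (the variables here are invented for illustration, not taken from Furrycat’s dataset):

```r
library(mgcv)

# Simulate a purely linear relationship with a little noise.
set.seed(42)
x <- runif(300)
y <- 3 * x + rnorm(300, sd = 0.1)

# Fit a smooth anyway; the penalty decides how wiggly it gets to be.
m <- gam(y ~ s(x))

# With genuinely linear data the effective degrees of freedom
# come out at (or very near) 1 -- the smooth is just a line.
summary(m)$s.table[, "edf"]
```

This is exactly the pattern s(hardiness) and s(intellect) show in the table.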

The smooths for intellect and hardiness both have edf = 1, meaning the GAM has reduced them to straight lines; their partial-effect plots confirm this.

plot(model, select = 5)  # s(intellect)

plot(model, select = 1)  # s(hardiness)

A linear model in intellect and hardiness alone fits almost perfectly: adjusted R² rounds to 1 and the residual standard error is only about 5.

model <- lm(mind ~ intellect + hardiness, data = normalized_df)
summary(model)
## 
## Call:
## lm(formula = mind ~ intellect + hardiness, data = normalized_df)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -8.7027 -4.3901 -0.7506  4.5294  8.4374 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept) 41.688608   0.708125   58.87   <2e-16 ***
## intellect   14.955890   0.001742 8587.93   <2e-16 ***
## hardiness    2.993651   0.001821 1643.66   <2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 5.139 on 367 degrees of freedom
## Multiple R-squared:      1,  Adjusted R-squared:      1 
## F-statistic: 7.177e+07 on 2 and 367 DF,  p-value: < 2.2e-16

The residuals, which fall within roughly ±9, suggest that some of the remaining error is due to rounding in the recorded stats.
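The arithmetic behind the rounding claim, as a back-of-the-envelope bound (my own hedged reasoning, not something computed from the data): if intellect and hardiness are recorded as integers, each can be off by up to 0.5 from its true value, and with slopes of roughly 15 and 3 the worst-case induced error in mind is:

```r
# Worst-case error in mind if each predictor has been rounded
# to the nearest integer (so each is within 0.5 of the true value):
max_rounding_error <- 15 * 0.5 + 3 * 0.5
max_rounding_error
# [1] 9
```

That bound of 9 lines up well with the observed residual range of −8.7 to 8.4.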

Conclusion

Mind is roughly captured by the formula \(mind \approx 42 + 15 * intellect + 3 * hardiness\).
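As a sanity check, the rounded rule-of-thumb can be compared against the exact fitted coefficients from the lm summary above (the input values below are arbitrary, chosen only for illustration):

```r
# Rounded rule-of-thumb from the conclusion.
approx_mind <- function(intellect, hardiness) {
  42 + 15 * intellect + 3 * hardiness
}

# Exact coefficients from the lm fit.
fitted_mind <- function(intellect, hardiness) {
  41.688608 + 14.955890 * intellect + 2.993651 * hardiness
}

approx_mind(100, 50)   # 1692
fitted_mind(100, 50)   # ~1687
```

The two agree to within a fraction of a percent over this range, so the rounded formula is a serviceable approximation.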