Data

A GAM fitted to Furrycat's data shows a strong relationship between mind and two of the predictors: intellect and hardiness.

library(mgcv)

model <- gam(
  formula = mind ~
    s(hardiness) +
    s(fortitude) +
    s(dexterity) +
    s(endurance) +
    s(intellect) +
    s(cleverness) +
    s(courage) +
    s(dependability) +
    s(power) +
    s(fierceness) +
    armor,
  family = gaussian(),
  data = normalized_df
)
summary(model)
## 
## Family: gaussian 
## Link function: identity 
## 
## Formula:
## mind ~ s(hardiness) + s(fortitude) + s(dexterity) + s(endurance) + 
##     s(intellect) + s(cleverness) + s(courage) + s(dependability) + 
##     s(power) + s(fierceness) + armor
## 
## Parametric coefficients:
##              Estimate Std. Error   t value Pr(>|t|)    
## (Intercept) 4654.2390     0.4185 11121.679   <2e-16 ***
## armor         -1.6074     1.2453    -1.291    0.198    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Approximate significance of smooth terms:
##                    edf Ref.df         F p-value    
## s(hardiness)     1.000  1.000 3.129e+05  <2e-16 ***
## s(fortitude)     1.000  1.000 1.267e+00  0.2611    
## s(dexterity)     1.000  1.000 2.200e-02  0.8810    
## s(endurance)     1.000  1.000 6.500e-01  0.4204    
## s(intellect)     1.000  1.000 6.481e+06  <2e-16 ***
## s(cleverness)    1.000  1.000 2.100e-02  0.8841    
## s(courage)       1.316  1.569 2.260e-01  0.7964    
## s(dependability) 1.000  1.000 4.320e-01  0.5112    
## s(power)         1.000  1.000 6.128e+00  0.0137 *  
## s(fierceness)    1.000  1.000 5.600e-01  0.4545    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## R-sq.(adj) =      1   Deviance explained =  100%
## GCV =  49.69  Scale est. = 48.198    n = 410
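Every smooth in the summary has an effective degrees of freedom (edf) of roughly 1, which means the penalized splines have collapsed to straight lines. A minimal sketch on simulated data (illustrative values, not Furrycat's) shows the same behavior:

```r
# Sketch: fit a smooth to data with a purely linear relationship and
# check that the smooth's effective degrees of freedom shrink to ~1.
# The data here are simulated, not Furrycat's.
library(mgcv)

set.seed(42)
n <- 500
x <- runif(n, 0, 100)
y <- 42 + 15 * x + rnorm(n, sd = 5)  # linear ground truth plus noise

fit <- gam(y ~ s(x))
summary(fit)$edf  # close to 1: the smooth is effectively a straight line
```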

The GAM's partial-effect plots confirm that the significant terms are linear. Plotting s(intellect) (the fifth smooth) and s(hardiness) (the first):

plot(model, select = 5)

plot(model, select = 1)

A plain linear model using only those two predictors fits almost perfectly, with an adjusted R-squared of 1 and small residuals.

model <- lm(mind ~ intellect + hardiness, data = normalized_df)
summary(model)
## 
## Call:
## lm(formula = mind ~ intellect + hardiness, data = normalized_df)
## 
## Residuals:
##    Min     1Q Median     3Q    Max 
## -9.556 -4.613 -0.497  4.257 94.315 
## 
## Coefficients:
##              Estimate Std. Error t value Pr(>|t|)    
## (Intercept) 41.531053   0.790720   52.52   <2e-16 ***
## intellect   14.953610   0.002235 6689.19   <2e-16 ***
## hardiness    2.995874   0.002104 1424.02   <2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 6.965 on 407 degrees of freedom
## Multiple R-squared:      1,  Adjusted R-squared:      1 
## F-statistic: 4.476e+07 on 2 and 407 DF,  p-value: < 2.2e-16

The residuals are small relative to the fitted values, and their pattern suggests that some of the remaining error is due to rounding.
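A quick sketch of how rounding alone produces a bounded band of residuals (simulated data, not Furrycat's):

```r
# Sketch: round a perfectly linear response to whole numbers and regress.
# The residuals are then confined to roughly [-0.5, 0.5], the band that
# rounding error produces.
set.seed(42)
x <- runif(300, 0, 100)
y <- round(42 + 15 * x)  # exact line, then rounded

fit <- lm(y ~ x)
range(resid(fit))  # roughly -0.5 to 0.5
```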

Conclusion

Mind is roughly captured by the formula \(mind \approx 42 + 15 \cdot intellect + 3 \cdot hardiness\).
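As a sanity check on simulated inputs (not the original data), generating mind from that formula and refitting recovers the rounded coefficients:

```r
# Sketch: simulate stats, apply the recovered formula, and confirm that
# lm() finds coefficients near 42, 15, and 3. Inputs are illustrative.
set.seed(42)
n <- 410
intellect <- runif(n, 0, 1000)
hardiness <- runif(n, 0, 1000)
mind <- round(42 + 15 * intellect + 3 * hardiness)

fit <- lm(mind ~ intellect + hardiness)
round(coef(fit))  # (Intercept) 42, intellect 15, hardiness 3
```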