
Parsimony principle

\ ˈpɑrsəˌmoʊni ˈprɪnsəpəl \

The parsimony principle for a statistical model states that a simpler model with fewer parameters is favored over a more complex model with more parameters, provided both fit the data similarly well.

The number of parameters in a model can be reduced by either:

  • reducing covariates to a simpler form; and/or
  • omitting any unnecessary variables.

For an example of reducing a covariate to a simpler form, imagine a regression model that includes a 50-level discrete covariate recording the US state in which an annuitant lives. If mortality experience can be described equally well by a simpler four-level US geographical covariate (e.g. NorthEast, MidWest, South, West), then the simpler model has 46 fewer parameters and is preferred over the more complex one.
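To make the parameter count concrete, here is a minimal sketch in Python, assuming hypothetical annuitant data and a simple logistic regression via the statsmodels formula API; the state codes, region mapping, and simulated mortality rates are all illustrative assumptions, not Club Vita's actual methodology.

```python
# Illustrative sketch only: hypothetical data, not the VitaCurves model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 20_000

# Hypothetical annuitant records: a 50-level state covariate and the
# simpler 4-level region covariate derived from it.
states = [f"S{i:02d}" for i in range(50)]
regions = ["NorthEast", "MidWest", "South", "West"]
region_of = {s: regions[i % 4] for i, s in enumerate(states)}

df = pd.DataFrame({"age": rng.integers(60, 95, n),
                   "state": rng.choice(states, n)})
df["region"] = df["state"].map(region_of)

# Simulate deaths driven by age and region only, so the 46 extra
# state-level parameters carry no real signal.
effect = df["region"].map(dict(zip(regions, [0.0, 0.1, 0.2, -0.1])))
p = 1 / (1 + np.exp(-(-12 + 0.12 * df["age"] + effect)))
df["died"] = (rng.random(n) < p).astype(int)

# Fit a logistic model with each geographical covariate.
complex_fit = smf.logit("died ~ age + C(state)", data=df).fit(disp=0)
simple_fit = smf.logit("died ~ age + C(region)", data=df).fit(disp=0)
print("state model parameters: ", int(complex_fit.df_model) + 1)  # 51
print("region model parameters:", int(simple_fit.df_model) + 1)   # 5
```

The two fits differ by exactly the 46 parameters noted above (49 state dummies versus 3 region dummies, relative to a baseline level).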

For the purposes of assessing parsimony we need measures of goodness of fit which penalise complexity.

When assessing parsimony for the VitaCurves model, Club Vita focus on three methods: the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC), and the Hosmer & Lemeshow (H&L) test.
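Continuing the hypothetical sketch above, the first two criteria are computed from the maximised likelihood L: AIC = 2k - 2 ln L and BIC = k ln n - 2 ln L, where k is the number of parameters and n the number of observations, so lower values are better and each extra parameter must earn its keep. The H&L test is not shown, as statsmodels provides no built-in version.

```python
# Continuing the sketch above: penalised goodness-of-fit measures.
# AIC = 2k - 2 ln L, BIC = k ln n - 2 ln L; lower values are better.
for name, fit in [("state (complex)", complex_fit),
                  ("region (simple)", simple_fit)]:
    print(f"{name:16} AIC = {fit.aic:9.1f}  BIC = {fit.bic:9.1f}")

# Because the 46 extra state parameters add little explanatory power,
# the simpler region model scores lower on both criteria.
```

Note that BIC's ln n multiplier penalises extra parameters more heavily than AIC's factor of 2 for any realistically sized dataset, so BIC tends to favour the simpler model even more strongly.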
