Abstract
Growth mixture models (GMMs) are a popular method for identifying latent classes of growth trajectories. One shortcoming of GMMs is nonconvergence, which often leads researchers to simplify estimation by constraining the covariance parameters to be equal across classes, though equality of covariances is frequently a dubious assumption. Alternative model specifications have been proposed to reduce nonconvergence without imposing covariance equality constraints. These methods perform well when the correct number of classes is known, but research has not yet examined their use when the number of classes is unknown. Given the importance of selecting the number of classes, more information about class enumeration performance is needed to assess the potential utility of these methods. We conducted an extensive simulation to explore the class enumeration and classification accuracy of model specifications that are more robust to nonconvergence. Results show that the typical approach of applying covariance equality constraints performed quite poorly. Instead, we recommend covariance pattern GMMs because they (a) had the highest convergence rates, (b) were most likely to identify the correct number of classes, and (c) had the highest classification accuracy across many conditions, even with modest sample sizes.
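The simulation details are in the full article, but the class-enumeration idea the abstract refers to can be sketched in Python. The example below is only a loose analogy: scikit-learn's GaussianMixture is an ordinary finite mixture over the repeated measures, not a latent-growth or covariance pattern GMM, and the data, time points, and settings are all hypothetical. It shows BIC-based selection of the number of classes and the contrast between class-invariant covariances (covariance_type="tied", analogous to covariance equality constraints) and class-specific covariances (covariance_type="full").

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical data: 5 repeated measures per person, two latent
# trajectory classes (flat vs. increasing), in wide format.
rng = np.random.default_rng(2022)
time = np.arange(5)
n_per_class = 250
class_means = [2.0 + 0.0 * time, 1.0 + 0.8 * time]
X = np.vstack([m + rng.normal(scale=1.0, size=(n_per_class, time.size))
               for m in class_means])

# Enumerate classes by BIC under two covariance specifications:
# "tied" -> one covariance matrix shared by all classes
#           (analogous to covariance equality constraints)
# "full" -> a separate covariance matrix per class
for cov_type in ("tied", "full"):
    for k in range(1, 5):
        gm = GaussianMixture(n_components=k, covariance_type=cov_type,
                             n_init=10, random_state=0).fit(X)
        print(f"{cov_type:>4s}  k={k}  BIC={gm.bic(X):8.1f}")
```

In the covariance pattern GMMs recommended above, the class-specific covariance is typically modeled directly with a parsimonious pattern (e.g., autoregressive or Toeplitz) rather than through random growth factors; that structure has no direct scikit-learn analogue in this sketch.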
| Original language | English (US) |
| --- | --- |
| Journal | Psychological Methods |
| DOIs | |
| State | Accepted/In press - 2022 |
Keywords
- Covariance pattern growth mixture model
- Group-based trajectory model
- Growth mixture model
- Latent class growth modeling
ASJC Scopus subject areas
- Psychology (miscellaneous)