Nonconvergence, Covariance Constraints, and Class Enumeration in Growth Mixture Models

Daniel McNeish, Jeffrey R. Harring, Daniel J. Bauer

Research output: Contribution to journal › Article › peer-review

3 Scopus citations


Growth mixture models (GMMs) are a popular method to identify latent classes of growth trajectories. One shortcoming of GMMs is nonconvergence, which often leads researchers to apply covariance equality constraints to simplify estimation, though this may be a dubious assumption. Alternative model specifications have been proposed to reduce nonconvergence without imposing covariance equality constraints. These methods perform well when the correct number of classes is known, but research has not yet examined their use when the number of classes is unknown. Given the importance of selecting the number of classes, more information about class enumeration performance is crucial to assess the potential utility of these methods. We conducted an extensive simulation to explore class enumeration and classification accuracy of model specifications that are more robust to nonconvergence. Results show that the typical approach of applying covariance equality constraints performs quite poorly. Instead, we recommend covariance pattern GMMs because they (a) had the highest convergence rates, (b) were most likely to identify the correct number of classes, and (c) had the highest classification accuracy in many conditions, even with modest sample sizes.
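The class enumeration task the abstract refers to, fitting candidate models with an increasing number of classes and selecting the number that minimizes an information criterion such as the BIC, can be sketched outside the SEM context with a generic Gaussian mixture. The sketch below uses scikit-learn's `GaussianMixture` as an illustrative stand-in (an assumption on our part; the paper's GMMs model growth trajectories with latent growth factors, which this simplified example omits):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Simulate simple two-class "trajectory" data: 4 repeated measures per person.
rng = np.random.default_rng(0)
n = 200
t = np.arange(4)
# Class 1: flat trajectories; Class 2: increasing trajectories.
y1 = 1.0 + 0.0 * t + rng.normal(0, 0.5, size=(n, 4))
y2 = 1.0 + 1.5 * t + rng.normal(0, 0.5, size=(n, 4))
Y = np.vstack([y1, y2])

# Class enumeration: fit mixtures with 1-4 classes, compare BIC,
# and keep the class count with the smallest BIC.
bics = {}
for k in range(1, 5):
    gm = GaussianMixture(n_components=k, covariance_type="full",
                         n_init=5, random_state=0).fit(Y)
    bics[k] = gm.bic(Y)
best_k = min(bics, key=bics.get)
print(best_k)
```

With two well-separated classes, the BIC comparison recovers `best_k = 2`. In practice, class enumeration in GMMs is complicated by the nonconvergence issues the article studies, since a model that fails to converge for some candidate number of classes cannot contribute a usable information criterion.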

Original language: English (US)
Journal: Psychological Methods
State: Accepted/In press - 2022


Keywords

  • Covariance pattern growth mixture model
  • Group based trajectory model
  • Growth mixture model
  • Latent class growth modeling

ASJC Scopus subject areas

  • Psychology (miscellaneous)


