
Nonconvergence, covariance constraints, and class enumeration in growth mixture models.

Daniel McNeish, Jeffrey R. Harring, Daniel J. Bauer
Published in: Psychological methods (2022)
Growth mixture models (GMMs) are a popular method for identifying latent classes of growth trajectories. One shortcoming of GMMs is nonconvergence, which often leads researchers to apply covariance equality constraints to simplify estimation, though this may be a dubious assumption. Alternative model specifications have been proposed to reduce nonconvergence without imposing covariance equality constraints. These methods perform well when the correct number of classes is known, but research has not yet examined their use when the number of classes is unknown. Given the importance of selecting the number of classes, more information about class enumeration performance is crucial for assessing the potential utility of these methods. We conducted an extensive simulation to explore the class enumeration and classification accuracy of model specifications that are more robust to nonconvergence. Results show that the typical approach of applying covariance equality constraints performs quite poorly. Instead, we recommend covariance pattern GMMs because they (a) had the highest convergence rates, (b) were most likely to identify the correct number of classes, and (c) had the highest classification accuracy in many conditions, even with modest sample sizes. An analysis of empirical posttraumatic stress disorder (PTSD) data is provided to show that the typical four-class solution found in many empirical PTSD studies may be an artifact of the covariance equality constraint method that has permeated this literature.
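The class enumeration procedure the abstract refers to is typically carried out by fitting mixture models with an increasing number of classes and comparing information criteria such as the BIC. As a minimal sketch of that workflow, the toy example below (not the authors' GMM specification; a simple one-dimensional Gaussian mixture fit by EM, with all function and variable names hypothetical) fits models with k = 1, 2, 3 components and selects the k with the lowest BIC:

```python
import numpy as np

def fit_gmm_1d_bic(x, k, n_iter=200, tol=1e-6):
    """Fit a k-component 1-D Gaussian mixture by EM and return its BIC.

    Deterministic initialization: component means at evenly spaced
    quantiles, a shared starting variance, and equal weights.
    """
    n = x.size
    means = np.quantile(x, np.linspace(0.1, 0.9, k))
    variances = np.full(k, x.var())
    weights = np.full(k, 1.0 / k)
    ll_old = -np.inf
    for _ in range(n_iter):
        # E-step: per-observation component densities and responsibilities
        dens = (weights / np.sqrt(2 * np.pi * variances)
                * np.exp(-(x[:, None] - means) ** 2 / (2 * variances)))
        total = dens.sum(axis=1, keepdims=True)
        resp = dens / total
        ll = np.log(total).sum()
        if ll - ll_old < tol:
            break
        ll_old = ll
        # M-step: update weights, means, and variances from responsibilities
        nk = resp.sum(axis=0)
        weights = nk / n
        means = (resp * x[:, None]).sum(axis=0) / nk
        variances = (resp * (x[:, None] - means) ** 2).sum(axis=0) / nk
        variances = np.maximum(variances, 1e-6)  # guard against collapse
    n_params = 3 * k - 1  # k means, k variances, k - 1 free weights
    return n_params * np.log(n) - 2 * ll

# Simulated data: two well-separated latent classes
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 200), rng.normal(8, 1, 200)])

bics = {k: fit_gmm_1d_bic(x, k) for k in (1, 2, 3)}
best_k = min(bics, key=bics.get)  # enumeration picks the lowest-BIC model
```

In a real GMM analysis each "observation" is a full growth trajectory with a within-class covariance structure, so the model and parameter count differ, but the enumerate-and-compare logic is the same; the paper's point is that which covariance structure is imposed can change which k this procedure selects.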