A Note on the Connection Between Trek Rules and Separable Nonlinear Least Squares in Linear Structural Equation Models.

Maximilian S. Ernst, Aaron Peikert, Andreas M. Brandmaier, Yves Rosseel
Published in: Psychometrika (2022)
We show that separable nonlinear least squares (SNLLS) estimation is applicable to all linear structural equation models (SEMs) that can be specified in RAM notation. SNLLS is an estimation technique that has successfully been applied to a wide range of models, for example, neural networks and dynamic systems, often leading to improvements in convergence and computation time. It is applicable to models of a special form in which a subset of parameters enters the objective linearly. Recently, Kreiberg et al. (Struct Equ Model Multidiscip J 28(5):725-739, 2021, https://doi.org/10.1080/10705511.2020.1835484) have shown that this is also the case for factor analysis models. We generalize this result to all linear SEMs. To that end, we show that undirected effects (variances and covariances) and mean parameters enter the objective linearly, and therefore, in the least squares estimation of structural equation models, only the directed effects have to be obtained iteratively. For model classes without unknown directed effects, SNLLS can be used to compute least squares estimates analytically. To provide deeper insight into the nature of this result, we employ trek rules that link graphical representations of structural equation models to their covariance parametrization. We further give an efficient expression for the gradient, which is crucial for a fast implementation. Results from our simulation indicate that SNLLS leads to improved convergence rates and a reduced number of iterations.
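
To illustrate the separable structure described above, here is a minimal Python sketch. It is not the authors' implementation and is not tied to any SEM package: it assumes the RAM parametrization Sigma(A, S) = F (I - A)^{-1} S (I - A)^{-T} F^T, an unweighted least-squares objective for simplicity, and made-up helper names (vech, design_matrix, concentrated_objective). The point it demonstrates is the one the abstract makes: for a fixed matrix of directed effects A, the implied covariance is linear in the free entries of S, so the (co)variances can be profiled out with a single linear solve, leaving only A for the outer iterative optimizer.

```python
# Sketch only: RAM notation Sigma(A, S) = F (I - A)^{-1} S (I - A)^{-T} F^T,
# unweighted least-squares fit to an observed covariance matrix.
import numpy as np

def vech(M):
    """Half-vectorization: stack the lower-triangular entries of M."""
    return M[np.tril_indices_from(M)]

def design_matrix(A, F, s_index):
    """Columns are d vech(Sigma)/d s_k for each free entry s_k of S.
    s_index lists the (row, col) positions of the free (co)variances;
    off-diagonal entries are mirrored to keep S symmetric."""
    n = A.shape[0]
    B = np.linalg.inv(np.eye(n) - A)
    cols = []
    for i, j in s_index:
        E = np.zeros((n, n))
        E[i, j] = 1.0
        if i != j:
            E[j, i] = 1.0
        cols.append(vech(F @ B @ E @ B.T @ F.T))
    return np.column_stack(cols)

def concentrated_objective(A, F, s_index, sigma_hat_vech):
    """Profile out S: for fixed directed effects A, the least-squares
    estimates of the free (co)variances come from one linear solve."""
    X = design_matrix(A, F, s_index)
    s_hat, *_ = np.linalg.lstsq(X, sigma_hat_vech, rcond=None)
    resid = sigma_hat_vech - X @ s_hat
    return float(resid @ resid), s_hat

# Toy usage: simple regression y = b*x, variables ordered (x, y).
# One directed effect b; free variances of x and of the residual of y.
if __name__ == "__main__":
    sigma_hat = np.array([[2.0, 1.0],
                          [1.0, 1.5]])   # "observed" covariance matrix
    F = np.eye(2)                        # both variables observed
    s_index = [(0, 0), (1, 1)]           # var(x), residual var(y)
    for b in (0.3, 0.5, 0.7):            # crude grid over the only
        A = np.array([[0.0, 0.0],        # nonlinear parameter b
                      [b,   0.0]])
        rss, s_hat = concentrated_objective(A, F, s_index, vech(sigma_hat))
        print(f"b={b:.1f}  rss={rss:.4f}  s_hat={np.round(s_hat, 3)}")
```

In this toy example only b would be iterated over by an outer optimizer; the same profiling idea carries over to mean parameters, and a weighted objective would replace the plain lstsq solve with a weighted one.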