longitudinal measurement invariance doesn't work with target="stan"
maugavilla opened this issue · 5 comments
When testing longitudinal measurement invariance, target="stan" doesn't work, while "stanclassic" does. The error appears when I add the factor loading or intercept equality labels across time.
The error it gives is:
SAMPLING FOR MODEL 'stanmarg' NOW (CHAIN 1).
Chain 1: Unrecoverable error evaluating the log probability at the initial value.
Chain 1: Exception: normal_lpdf: Random variable has dimension = 3, expecting dimension = 4; a function was called with arguments of different scalar, array, vector, or matrix types, and they were not consistently sized; all arguments must be scalars or multidimensional values of the same shape. (in 'model_stanmarg' at line 689)
[1] "Error in sampler$call_sampler(args_list[[i]]) : "
[2] " Exception: normal_lpdf: Random variable has dimension = 3, expecting dimension = 4; a function was called with arguments of different scalar, array, vector, or matrix types, and they were not consistently sized; all arguments must be scalars or multidimensional values of the same shape. (in 'model_stanmarg' at line 689)"
error occurred during calling the sampler; sampling not done
Stan model 'stanmarg' does not contain samples.
Error in rowvec[tmpw[, 1] == 0] <- grep(stanvec[j], names(b.est)) :
replacement has length zero
Here is an example that reproduces the error with simulated data:
library(blavaan)
mod_sim <- '
f1 =~ 0.7?y1_t1 + 0.7?y2_t1 + 0.7?y3_t1
f2 =~ 0.7?y1_t2 + 0.7?y2_t2 + 0.7?y3_t2
f1 ~~ 1?f1
f2 ~~ 1.5?f2
f1 ~0?1
f2 ~0.7?1
f1 ~~ 0.4?f2
y1_t1 ~~ 0.3?y1_t2
y2_t1 ~~ 0.3?y2_t2
y3_t1 ~~ 0.3?y3_t2
'
dat_sim <- simulateData(mod_sim, sample.nobs = 300)
head(dat_sim)
mod_conf <- '
f1 =~ y1_t1 + y2_t1 + y3_t1
f2 =~ y1_t2 + y2_t2 + y3_t2
y1_t1 ~~ y1_t2
y2_t1 ~~ y2_t2
y3_t1 ~~ y3_t2
'
fit_conf <- bcfa(mod_conf, data=dat_sim, std.lv=TRUE)
summary(fit_conf)
mod_weak <- '
f1 =~ l1*y1_t1 + l2*y2_t1 + l3*y3_t1
f2 =~ l1*y1_t2 + l2*y2_t2 + l3*y3_t2
f1 ~~ 1*f1
f2 ~~ NA*f2
y1_t1 ~~ y1_t2
y2_t1 ~~ y2_t2
y3_t1 ~~ y3_t2
'
fit_weak <- bcfa(mod_weak, data=dat_sim, std.lv=TRUE)
summary(fit_weak)
mod_strong <- '
f1 =~ l1*y1_t1 + l2*y2_t1 + l3*y3_t1
f2 =~ l1*y1_t2 + l2*y2_t2 + l3*y3_t2
f1 ~~ 1*f1
f2 ~~ NA*f2
f1 ~0*1
f2 ~NA*1
y1_t1 ~~ y1_t2
y2_t1 ~~ y2_t2
y3_t1 ~~ y3_t2
y1_t1 ~i1*1
y2_t1 ~i2*1
y3_t1 ~i3*1
y1_t2 ~i1*1
y2_t2 ~i2*1
y3_t2 ~i3*1
'
fit_strong <- bcfa(mod_strong, data=dat_sim, std.lv=TRUE)
summary(fit_strong)
Thanks for the report. This is a conflict between std.lv=TRUE and freely estimating an lv variance. I will have to dig to see how to fix it. In the meantime, it is possible to estimate the model by using blavaan() and avoiding std.lv=TRUE:
fit_weak <- blavaan(mod_weak, data=dat_sim, auto.fix.first=FALSE, auto.var=TRUE,
                    int.ov.free=TRUE, auto.cov.lv.x=TRUE)
The model also appears to work with the other targets (jags and stanclassic; there is also a new "stancond" target on GitHub that is not fully finished).
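For example, a minimal sketch using the same data and mod_weak from above, switching only the target (shown with "stanclassic"; "jags" is selected the same way):

# same model and data as above; only the MCMC target changes
fit_weak_classic <- bcfa(mod_weak, data=dat_sim, std.lv=TRUE, target="stanclassic")
summary(fit_weak_classic)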
Good to see there is a way around it with blavaan(). I tested with the other targets, jags and stanclassic, and they worked fine.
I did see stancond in the GitHub changes. What is that target for?
Thanks
It treats latent variables as parameters, like stanclassic, but the code is simpler and the sampling is often more efficient.
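If you want to try it, a rough sketch (assuming the GitHub development version of blavaan is installed and that the new target is selected through the usual target= argument, like the others):

# "stancond" is not fully finished; this assumes it is chosen like the other targets
fit_weak_cond <- bcfa(mod_weak, data=dat_sim, std.lv=TRUE, target="stancond")
summary(fit_weak_cond)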
I think it is now fixed, but let me know if not.
Yes, I just ran the test and it works. Thanks!
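(For reference, the re-check is just the original call with the default target, assuming blavaan has been updated to the fixed GitHub version:)

# with the updated blavaan, the default target="stan" now samples without error
fit_weak <- bcfa(mod_weak, data=dat_sim, std.lv=TRUE)
summary(fit_weak)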