
I used an already published Likert scale for my survey and collected responses from 98 participants. The scale runs from 1 to 5, from strongly disagree to strongly agree.

Looking at the variables, the average value of one of the factors is above 3 for all of its questions. The figure below shows the average of the responses.

[figure: average of the responses per question]

But when evaluating the variances, the Estimate and Std.lv values for one factor are negative:

                  Estimate  Std.Err  z-value  P(>|z|)   Std.lv  Std.all
    .Competence     -0.188    0.105   -1.796    0.073   -0.324   -0.324

and lavaan gives the warning:

lavaan WARNING: some estimated lv variances are negative

The model I am using:

model <- '
  Competence  =~ COMP1 + COMP2 + COMP3 + COMP4
  Autonomy    =~ AUT1 + AUT2 + AUT3
  Relatedness =~ REL1 + REL2 + REL3 + REL4
  Motivation  =~ Autonomy + Relatedness + Competence

  Vigor      =~ VIGOR1 + VIGOR2 + VIGOR3 + VIGOR4 + VIGOR5
  Dedication =~ DED1 + DED2 + DED3 + DED4 + DED5
  Absorption =~ ABS1 + ABS2 + ABS3 + ABS4
  Engagement =~ Vigor + Dedication + Absorption

  Motivation ~~ Engagement
'

fit <- sem(model, data = Log_And_SurveyResult)

summary(fit, standardized = TRUE)
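To see exactly which parameters trigger the warning, the parameter table can be filtered. This is a sketch using lavaan's standard `parameterEstimates()` extractor and the `fit` object above:

```r
library(lavaan)

# Full parameter table, including standardized estimates
pe <- parameterEstimates(fit, standardized = TRUE)

# Variance rows have operator "~~" with lhs == rhs; a negative
# `est` in one of these rows is what produces the lavaan warning
neg_var <- subset(pe, op == "~~" & lhs == rhs & est < 0)
print(neg_var)
```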

However, what these variables predict appears to be significant with other variables, i.e., Motivation and Engagement seem to be correlated.

Now, because of the negative estimate, I am confused about how to interpret the result.

I can add further information if needed to answer the question.

Also, in the output of lavaanPlot, the loadings are high.

[figure: lavaanPlot output]

I have been stuck on the interpretation for many days. Any help will be appreciated.

Thank you.

Comments:

  • I think you misread the output. The only quantity proportional to a variance is the standard error of the estimate, abbreviated Std.Err, and indeed it is non-negative, as it must be. Have you consulted your software documentation for information about what the output means? Commented Sep 23, 2020 at 14:04
  • @whuber It is producing the warning lavaan WARNING: some estimated lv variances are negative. I have also updated the value of the loadings, and the value is 3.58. Commented Sep 23, 2020 at 14:11
  • That's a completely different issue. As you know, a true variance cannot be negative, so when a program computes a variance and detects a negative value, it knows something has gone wrong. But exactly what has gone wrong depends on the program and on the details of what it is doing. Often, the important clues are in the exact details of the error message and when in your workflow it occurs. That's the kind of information that would help people give you good answers. Commented Sep 23, 2020 at 14:38

1 Answer

There may be a few issues going on. The first that comes to mind is that your estimator may be inappropriate. It looks like you've used the default maximum likelihood estimator, but it carries specific assumptions (roughly, continuous and multivariate normal indicators) that Likert items may not meet. You could try the WLSMV estimator instead. Also, since you're doing a factor analysis of the scale, you might call cfa() rather than sem(). It shouldn't change your results much, but cfa() has some useful default arguments for when the goal is just a factor analysis.
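A sketch of that suggestion, reusing the model string and data frame from the question. In recent lavaan versions, `ordered = TRUE` declares all indicators as ordered-categorical, which switches the default estimator to WLSMV (you can also pass the item names explicitly, or set `estimator = "WLSMV"` yourself):

```r
library(lavaan)

# Treat the Likert items as ordinal; lavaan then fits the model on
# polychoric correlations with the WLSMV estimator by default
fit_cat <- cfa(model,
               data    = Log_And_SurveyResult,
               ordered = TRUE)

summary(fit_cat, standardized = TRUE, fit.measures = TRUE)
```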

Other issues you might want to consider are that your sample is too small and/or that there is too much collinearity in the data. I'm not terribly surprised to find a Heywood case in this model, since you're fitting a hierarchical factor analysis on just 98 people. I'd go through some assumption checking if an alternative estimator doesn't fix the problem.
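Two quick collinearity checks along those lines, assuming the item names and fitted object from the question:

```r
library(lavaan)

# Inter-item correlations for one factor's indicators; values very
# close to 1 suggest near-redundant items, a common source of
# negative variance estimates
comp_items <- Log_And_SurveyResult[, c("COMP1", "COMP2", "COMP3", "COMP4")]
round(cor(comp_items, use = "pairwise.complete.obs"), 2)

# Model-implied correlations among the latent variables; values near 1
# indicate factors that may not be empirically distinguishable
lavInspect(fit, "cor.lv")
```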

Another possible issue is simply that the model is misspecified. If nothing else works, you might run an exploratory factor analysis and see whether that yields better-behaved models.

Comments:

  • The WLSMV estimator eliminates the negative value, and the loadings on the plot have also decreased, but what about the Heywood cases? Can you add something on how that could be the case? Commented Sep 23, 2020 at 15:08
  • Heywood cases occur when an item has a standardized loading > 1 and a negative error variance. They arise when there is inadequate data to estimate the parameters, non-normal data or data with outliers, misspecified models (often too many factors extracted), or when the parameter estimate is close to a boundary in the population. Maximum likelihood estimation in factor analysis is particularly vulnerable to Heywood cases, while other estimators tend to be OK since they don't rely on the same assumptions (e.g., normality). Commented Sep 23, 2020 at 15:52
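The two signatures described in that comment can be checked directly from the parameter table. A sketch, assuming the fitted `fit` object from the question and lavaan's `parameterEstimates()`:

```r
library(lavaan)

pe <- parameterEstimates(fit, standardized = TRUE)

# Standardized loadings with |std.all| > 1 point at the item(s)
# involved in a Heywood case (loadings use the "=~" operator)
subset(pe, op == "=~" & abs(std.all) > 1)
```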
