Accuracy of Revised and Traditional Parallel Analyses for Assessing Dimensionality with Binary Data

Samuel B. Green, Nickalus Redell, Marilyn Thompson, Roy Levy

Research output: Contribution to journal › Article › peer-review

33 Scopus citations

Abstract

Parallel analysis (PA) is a useful empirical tool for assessing the number of factors in exploratory factor analysis. On conceptual and empirical grounds, we argue for a revision to PA that makes it more consistent with hypothesis testing. Using Monte Carlo methods, we evaluated the relative accuracy of the revised PA (R-PA) and traditional PA (T-PA) methods for factor analysis of tetrachoric correlations between items with binary responses. We manipulated five data generation factors: number of observations, type of factor model, factor loadings, correlation between factors, and distribution of thresholds. The R-PA method tended to be more accurate than T-PA, although not uniformly across conditions. R-PA tended to perform better relative to T-PA if the underlying model (a) was unidimensional but had some unique items, (b) had highly correlated factors, or (c) had a general factor as well as a group factor. In addition, R-PA tended to outperform T-PA if items had higher factor loadings and sample size was large. A major disadvantage of the T-PA method was that it frequently yielded inflated Type I error rates.
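The abstract describes both procedures conceptually; the sketch below illustrates only the core logic of traditional parallel analysis (T-PA) and is not the authors' implementation. The study factor-analyzes tetrachoric correlations between binary items; for brevity this sketch compares eigenvalues of Pearson correlations of the binary responses (a simplification), and the function names and simulation settings are illustrative assumptions.

```python
# Minimal sketch of traditional parallel analysis (T-PA) for binary items.
# Assumptions: Pearson (not tetrachoric) correlations, column permutation
# to build the reference (no-factor) data, 95th-percentile criterion.
import numpy as np

def parallel_analysis(data, n_reps=100, percentile=95, seed=0):
    """Retain factors whose observed eigenvalues exceed the chosen
    percentile of eigenvalues computed from comparison data with
    independent columns."""
    rng = np.random.default_rng(seed)
    n, p = data.shape

    # Eigenvalues of the observed correlation matrix, largest first.
    obs_eigs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]

    # Reference eigenvalues from column-permuted (independent) data.
    rand_eigs = np.empty((n_reps, p))
    for r in range(n_reps):
        permuted = np.column_stack(
            [rng.permutation(data[:, j]) for j in range(p)])
        rand_eigs[r] = np.linalg.eigvalsh(
            np.corrcoef(permuted, rowvar=False))[::-1]
    thresholds = np.percentile(rand_eigs, percentile, axis=0)

    # Count leading eigenvalues that exceed their reference thresholds.
    k = 0
    while k < p and obs_eigs[k] > thresholds[k]:
        k += 1
    return k

# Illustrative data: four items driven by one latent factor, four noise items.
rng = np.random.default_rng(1)
latent = rng.normal(size=(500, 1))
signal = (latent + rng.normal(size=(500, 4)) > 0).astype(float)
noise = (rng.normal(size=(500, 4)) > 0).astype(float)
items = np.hstack([signal, noise])
print(parallel_analysis(items))  # expected to suggest a single factor
```

Under the revised procedure (R-PA) argued for in the abstract, the comparison data at each step would instead be generated from a model with one fewer factor than the number being tested, so that each retention decision operates as a hypothesis test rather than a comparison against purely random data.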

Original language: English (US)
Pages (from-to): 5-21
Number of pages: 17
Journal: Educational and Psychological Measurement
Volume: 76
Issue number: 1
DOIs
State: Published - Feb 1 2016

Keywords

  • binary data
  • factor analysis
  • parallel analysis
  • revised parallel analysis

ASJC Scopus subject areas

  • Education
  • Developmental and Educational Psychology
  • Applied Psychology
  • Applied Mathematics
