
Approximated Penalized Maximum Likelihood for Exploratory Factor Analysis: An Orthogonal Case

Abstract

This paper studies penalized maximum likelihood (PML) estimation of an exploratory factor analysis (EFA) model. An EFA model is typically estimated by maximum likelihood, after which the estimated loading matrix is rotated to obtain a sparse representation. Penalized maximum likelihood instead fits the EFA model and produces a sparse loading matrix simultaneously. To overcome some of the computational drawbacks of PML, this paper proposes an approximation to PML, which is also applied to an empirical dataset for illustration. A simulation study shows that, under various conditions, the approximation naturally produces a sparse loading matrix and estimates the factor loadings and the covariance matrix more accurately than factor rotations, in the sense of a lower mean squared error.
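
To make the setup concrete, the display below sketches the general form of a penalized ML objective for an orthogonal EFA model, as it is commonly written in the sparse factor analysis literature; the exact penalty function and scaling used in this paper may differ. With a sample covariance matrix S of p observed variables, a p × m loading matrix Λ, and a diagonal matrix Ψ of unique variances, a sparsity penalty is added to the ML discrepancy:

$$
F_{\rho}(\Lambda, \Psi) = \log\det\bigl(\Lambda\Lambda^{\top} + \Psi\bigr) + \operatorname{tr}\Bigl[S\bigl(\Lambda\Lambda^{\top} + \Psi\bigr)^{-1}\Bigr] + \rho \sum_{i=1}^{p}\sum_{j=1}^{m} P\bigl(|\lambda_{ij}|\bigr),
$$

where P(·) is a sparsity-inducing penalty (for example, the lasso P(t) = t) and ρ ≥ 0 controls the degree of sparsity. Setting ρ = 0 recovers ordinary maximum likelihood, whose estimated loadings are then typically rotated, for example by varimax, toward simple structure.

The conventional two-step workflow that PML is contrasted with can be sketched as follows. This is a minimal illustration, not the authors' implementation; it uses scikit-learn's FactorAnalysis (which maximizes a Gaussian likelihood) with its built-in varimax rotation, available in scikit-learn 0.24 and later, and the iris data purely as stand-in input.

```python
# Minimal sketch of the conventional two-step EFA workflow:
# (1) estimate loadings by maximum likelihood, (2) rotate them.
import numpy as np
from sklearn.datasets import load_iris          # stand-in data for illustration
from sklearn.decomposition import FactorAnalysis

X = load_iris().data
X = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize the observed variables

# Step 1: ML factor analysis with unrotated loadings.
fa_unrotated = FactorAnalysis(n_components=2, rotation=None).fit(X)

# Step 2: the same fit followed by a varimax rotation of the loading matrix,
# which yields an approximately simple, but not exactly sparse, structure.
fa_rotated = FactorAnalysis(n_components=2, rotation="varimax").fit(X)

print(np.round(fa_unrotated.components_.T, 2))  # p x m unrotated loadings
print(np.round(fa_rotated.components_.T, 2))    # p x m varimax-rotated loadings
```

In contrast, a PML or approximated-PML fit returns a loading matrix with exact zeros directly, without a separate rotation step.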

Acknowledgements

Shaobo Jin and Fan Yang-Wallentin are partly supported by Vetenskapsrådet (the Swedish Research Council) under contract 2017-01175. We thank the reviewers for their valuable comments, and Måns Thulin for his comments on an early version of the manuscript.

Author information

Corresponding author

Correspondence to Shaobo Jin.

Cite this article

Jin, S., Moustaki, I. & Yang-Wallentin, F. Approximated Penalized Maximum Likelihood for Exploratory Factor Analysis: An Orthogonal Case. Psychometrika 83, 628–649 (2018). https://doi.org/10.1007/s11336-018-9623-z
