DOI: 10.1145/1273496.1273641
Article

Spectral feature selection for supervised and unsupervised learning

Published: 20 June 2007

ABSTRACT

Feature selection aims to reduce dimensionality for building comprehensible learning models with good generalization performance. Feature selection algorithms have largely been studied separately according to the type of learning: supervised or unsupervised. This work exploits intrinsic properties underlying supervised and unsupervised feature selection algorithms and proposes a unified framework for feature selection based on spectral graph theory. The proposed framework can generate families of algorithms for both supervised and unsupervised feature selection, and we show that existing powerful algorithms such as ReliefF (supervised) and Laplacian Score (unsupervised) are special cases of it. To the best of our knowledge, this work is the first attempt to unify supervised and unsupervised feature selection and to enable their joint study under a general framework. Experiments demonstrate the efficacy of the novel algorithms derived from the framework.
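The abstract names the Laplacian Score (reference 5, He et al.) as an unsupervised special case of the spectral framework. Below is a minimal sketch of that score, assuming a fully connected RBF similarity graph with an illustrative bandwidth parameter `sigma`; the framework described in the paper also admits k-NN graphs and other edge weightings, so treat these choices as assumptions, not the paper's prescribed setup.

```python
import numpy as np

def laplacian_score(X, sigma=1.0):
    """Laplacian Score for each feature of X (n_samples, n_features).

    Sketch following He, Cai & Niyogi (2005): features that vary
    smoothly over the sample-similarity graph receive LOWER scores
    and are considered more relevant. The fully connected RBF graph
    and sigma are illustrative assumptions.
    """
    # Pairwise squared distances between samples (clipped at 0 to
    # guard against tiny negative values from floating-point error)
    sq = np.sum(X ** 2, axis=1)
    dist2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    S = np.exp(-dist2 / (2.0 * sigma ** 2))   # RBF similarity graph
    d = S.sum(axis=1)                          # node degrees
    D = np.diag(d)
    L = D - S                                  # unnormalized graph Laplacian
    scores = []
    for r in range(X.shape[1]):
        f = X[:, r]
        # Center the feature by its degree-weighted mean
        f_t = f - (f @ d) / d.sum()
        num = f_t @ L @ f_t                    # smoothness on the graph
        den = f_t @ D @ f_t                    # degree-weighted variance
        scores.append(num / den if den > 0 else np.inf)
    return np.array(scores)
```

As a quick sanity check of the intuition: a feature that separates two well-formed clusters scores lower (better) than a pure-noise feature, because it is nearly constant along heavily weighted within-cluster edges.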

References

  1. Chapelle, O., Schölkopf, B., & Zien, A. (Eds.) (2006). Semi-Supervised Learning, chapter Graph-Based Methods. The MIT Press.
  2. Chung, F. (1997). Spectral Graph Theory. AMS.
  3. Dy, J., & Brodley, C. E. (2004). Feature selection for unsupervised learning. JMLR, 5, 845--889.
  4. Guyon, I., & Elisseeff, A. (2003). An introduction to variable and feature selection. JMLR, 3, 1157--1182.
  5. He, X., Cai, D., & Niyogi, P. (2005). Laplacian score for feature selection. NIPS. MIT Press.
  6. Kondor, R. I., & Lafferty, J. (2002). Diffusion kernels on graphs and other discrete structures. ICML.
  7. Lanckriet, G. R. G., Cristianini, N., Bartlett, P., Ghaoui, L. E., & Jordan, M. I. (2004). Learning the kernel matrix with semidefinite programming. JMLR, 5, 27--72.
  8. Liu, H., & Yu, L. (2005). Toward integrating feature selection algorithms for classification and clustering. IEEE TKDE, 17, 491--502.
  9. Ng, A., Jordan, M., & Weiss, Y. (2001). On spectral clustering: Analysis and an algorithm. NIPS.
  10. Lehoucq, R. B. (2001). Implicitly restarted Arnoldi methods and subspace iteration. SIAM J. Matrix Anal. Appl., 23, 551--562.
  11. Robnik-Sikonja, M., & Kononenko, I. (2003). Theoretical and empirical analysis of ReliefF and RReliefF. Machine Learning, 53, 23--69.
  12. Shi, J., & Malik, J. (1997). Normalized cuts and image segmentation. CVPR.
  13. Smola, A., & Kondor, R. (2003). Kernels and regularization on graphs. COLT.
  14. Wolf, L., & Shashua, A. (2005). Feature selection for unsupervised and supervised inference: The emergence of sparsity in a weight-based approach. JMLR, 6, 1855--1887.
  15. Zhang, T., & Ando, R. (2006). Analysis of spectral kernel design based semi-supervised learning. NIPS.
  16. Zhao, Z., & Liu, H. (2007). Semi-supervised feature selection via spectral analysis. SDM.
    • Published in

      ICML '07: Proceedings of the 24th international conference on Machine learning
      June 2007
      1233 pages
ISBN: 9781595937933
DOI: 10.1145/1273496

      Copyright © 2007 ACM


      Publisher

      Association for Computing Machinery

      New York, NY, United States



      Acceptance Rates

Overall acceptance rate: 140 of 548 submissions, 26%
