Feature Subset Selection Approach by Gray-Wolf Optimization

  • Conference paper
Afro-European Conference for Industrial Advancement

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 334)

Abstract

A feature selection algorithm explores the data to eliminate noisy, irrelevant, and redundant features while simultaneously optimizing classification performance. In this paper, a classification-accuracy-based fitness function is proposed for the gray-wolf optimizer to find an optimal feature subset. The gray-wolf optimizer is a recent evolutionary computation technique that mimics the leadership hierarchy and hunting mechanism of gray wolves in nature. Its aim is to find optimal regions of a complex search space through the interaction of individuals in the population. Compared with particle swarm optimization (PSO) and genetic algorithms (GA) over a set of datasets from the UCI machine learning repository, the proposed approach achieves better performance in both classification accuracy and feature-size reduction. Moreover, the gray-wolf optimization approach proves much more robust against initialization than the PSO and GA optimizers.





Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Emary, E., Zawbaa, H.M., Grosan, C., Hassenian, A.E. (2015). Feature Subset Selection Approach by Gray-Wolf Optimization. In: Abraham, A., Krömer, P., Snasel, V. (eds) Afro-European Conference for Industrial Advancement. Advances in Intelligent Systems and Computing, vol 334. Springer, Cham. https://doi.org/10.1007/978-3-319-13572-4_1

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-13572-4_1

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-13571-7

  • Online ISBN: 978-3-319-13572-4

  • eBook Packages: Engineering (R0)
