
A competitive swarm optimizer with hybrid encoding for simultaneously optimizing the weights and structure of Extreme Learning Machines for classification problems

  • Original Article
  • Published in: International Journal of Machine Learning and Cybernetics

Abstract

Extreme Learning Machine (ELM) is a recently proposed algorithm for training single-hidden-layer feedforward networks (SLFNs). It has many attractive properties, including good generalization performance and very fast learning. ELM assigns random values to the input weights and hidden biases and then determines the output weights in a single step using the Moore-Penrose generalized inverse. Despite these advantages, ELM performance can be degraded by the random initialization of the weights and biases, or by an overly large network that contains an unnecessary number of neurons. To increase generalization performance and produce more compact networks, this paper proposes a hybrid model that combines ELM with the competitive swarm optimizer (CSO). The proposed model (CSONN-ELM) optimizes the weights and biases and dynamically determines the most appropriate number of neurons. To evaluate its effectiveness, CSONN-ELM is tested on 23 benchmark datasets and compared against a set of static rules from the literature for setting the number of neurons in an SLFN. It is also compared with two dynamic approaches for enhancing ELM, namely Optimally Pruned ELM (OP-ELM) and metaheuristic-based ELMs (Particle Swarm Optimization-ELM and Differential Evolution-ELM). The results show that the proposed method improves the generalization performance of ELM and outperforms both the static and the dynamic methods.
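As a rough illustration of the ELM training step described in the abstract, the sketch below draws random input weights and hidden biases and then solves for the output weights in one step with the Moore-Penrose generalized inverse. The sigmoid activation, the uniform initialization range, the n_hidden default, and the function names are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

def train_elm(X, T, n_hidden=50, seed=0):
    """Train a single-hidden-layer feedforward network with the basic ELM recipe.

    X: (n_samples, n_features) inputs; T: (n_samples, n_classes) one-hot targets.
    """
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Step 1: assign random values to the input weights and hidden biases.
    W = rng.uniform(-1.0, 1.0, size=(n_features, n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    # Step 2: compute the hidden-layer output matrix H (sigmoid activation assumed).
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    # Step 3: determine the output weights in one step via the
    # Moore-Penrose generalized inverse of H.
    beta = np.linalg.pinv(H) @ T
    return W, b, beta

def predict_elm(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta  # class scores; take the argmax over columns for the predicted label
```

In the hybrid encoding suggested by the title, one could imagine each CSO candidate concatenating a binary mask that switches individual hidden neurons on or off with the real-valued input weights and biases, so that structure and weights are searched simultaneously; the exact encoding used by CSONN-ELM is given in the full text.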




Author information


Corresponding author

Correspondence to Hossam Faris.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Eshtay, M., Faris, H. & Obeid, N. A competitive swarm optimizer with hybrid encoding for simultaneously optimizing the weights and structure of Extreme Learning Machines for classification problems. Int. J. Mach. Learn. & Cyber. 11, 1801–1823 (2020). https://doi.org/10.1007/s13042-020-01073-y

