A Systematic Review for Human EEG Brain Signals Based Emotion Classification, Feature Extraction, Brain Condition, Group Comparison

Journal of Medical Systems (Image & Signal Processing)

Abstract

The study of electroencephalography (EEG) signals is not a new topic. However, the analysis of human emotions upon exposure to music is considered an important research direction. Although distributed across various academic databases, research on this concept is limited. To extend research in this area, the researchers explored and analysed the academic articles published within this scope. Thus, in this paper a systematic review is carried out to map and draw the research landscape for EEG-based human emotion into a taxonomy. All articles on EEG-based human emotion in the context of music were systematically searched in three main databases, ScienceDirect, Web of Science and IEEE Xplore, covering 1999 to 2016. These databases feature academic studies that used EEG to measure brain signals, with a focus on the effects of music on human emotions. The screening and filtering of articles were performed in three iterations. In the first iteration, duplicate articles were excluded. In the second iteration, the articles were filtered according to their titles and abstracts, and articles outside the scope of our domain were excluded. In the third iteration, the articles were filtered by reading the full text, and articles outside the scope of our domain or that did not meet our criteria were excluded. Based on the inclusion and exclusion criteria, 100 articles were selected and separated into five classes. The first class, comprising 39 articles (39%), concerns emotion, wherein various emotions are classified using artificial intelligence (AI). The second class, comprising 21 articles (21%), is composed of studies that use EEG techniques; this class is named ‘brain condition’. The third class, comprising eight articles (8%), relates to feature extraction, a step that precedes emotion classification. It should be noted that this process makes use of classifiers; however, these articles are not listed under the first class because they focus on feature extraction rather than classifier accuracy. The fourth class, comprising 26 articles (26%), consists of studies that compare two or more groups to identify and discover human emotion based on EEG. The final class, comprising six articles (6%), represents articles that study music as a stimulus and its impact on brain signals. Five main aspects are then discussed: action types, age of the participants, sample size, duration of recording and of listening to music, and the countries or nationalities of the authors who published these previous studies. The review afterwards identifies the main characteristics of this promising area of science in terms of the motivation for using EEG to measure human brain signals, the open challenges obstructing its employment, and recommendations to improve the utilization of the EEG process.
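For orientation, the sketch below illustrates the kind of two-stage pipeline many of the reviewed studies describe, in which spectral features (here, band power from a Welch power spectral density estimate) are extracted from EEG trials and then passed to an AI classifier (here, an SVM). This is a minimal illustration on synthetic data, not any particular study's implementation; the sampling rate, channel count, frequency bands and library choices (NumPy, SciPy, scikit-learn) are our own assumptions.

```python
# Minimal sketch (not the authors' implementation) of a PSD-feature + SVM
# emotion-classification pipeline of the kind surveyed in this review.
# Synthetic noise stands in for real EEG, so accuracy will be near chance.
import numpy as np
from scipy.signal import welch
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 128                                            # sampling rate (Hz), hypothetical
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(trial):
    """Feature extraction: mean band power per channel from the Welch PSD."""
    freqs, psd = welch(trial, fs=FS, nperseg=FS, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        idx = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, idx].mean(axis=-1))     # one value per channel
    return np.concatenate(feats)

# Synthetic stand-in: 80 trials x 14 channels x 5 s of "EEG", two emotion labels.
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((80, 14, 5 * FS))
y = rng.integers(0, 2, size=80)                     # e.g. low vs. high valence

X = np.array([band_powers(trial) for trial in X_raw])

# Emotion classification: SVM on the extracted features, 5-fold cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

In this framing, the review's third class (feature extraction) corresponds to the band_powers step, while its first class (emotion classification) corresponds to the classifier and its reported accuracy.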



Abbreviations

GNB: Gaussian naïve Bayes

KNN: K-nearest neighbors

LDA: Linear discriminant analysis

HCI: Human-computer interface

ANN: Artificial neural network

SVM: Support vector machine

PNN: Probabilistic neural network

MIR: Music information retrieval

BCMI: Brain-computer music interfacing

DBNs: Dynamic Bayesian networks

BCI: Brain-computer interface

MLP: Multilayer perceptron classifier

RVM: Relevance vector machine

ERP: Event-related potential

PCA: Principal component analysis

DCAU: Differential caudality

PSD: Power spectral density

EMD: Empirical mode decomposition

SampEn: Sample entropy

HHT: Hilbert–Huang transform

CG: Control group

ECG: Electrocardiography

NH: Normal hearing

MFDF: Multifractal detrended fluctuation

rCBF: Regional cerebral blood flow

GA: Genetic algorithm


Author information

Corresponding author

Correspondence to A. A. Zaidan.

Ethics declarations

Conflict of interest

The authors declare no conflict of interest.

Ethical approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Additional information

Highlights

• A coherent taxonomy for music-elicited changes in emotion and EEG.

• Motivations for studying music-elicited changes in emotion and EEG.

• Open challenges that hinder research on music-elicited changes in emotion and EEG.

• Recommendations to improve the acceptance of research on music-elicited changes in emotion and EEG.

• A focus on the methods used in this literature.

This article is part of the Topical Collection on Image & Signal Processing

About this article

Cite this article

Hamada, M., Zaidan, B.B. & Zaidan, A.A. A Systematic Review for Human EEG Brain Signals Based Emotion Classification, Feature Extraction, Brain Condition, Group Comparison. J Med Syst 42, 162 (2018). https://doi.org/10.1007/s10916-018-1020-8
