Review

Review and Classification of Emotion Recognition Based on EEG Brain-Computer Interface System Research: A Systematic Review

1 College of Computer and Information Sciences, Imam Muhammad bin Saud University, Riyadh 11432, Saudi Arabia
2 College of Computer and Information Sciences, King Saud University, Riyadh 11543, Saudi Arabia
3 Center for Complex Engineering Systems at KACST and MIT, King Abdulaziz City for Science and Technology, Riyadh 11442, Saudi Arabia
* Author to whom correspondence should be addressed.
Appl. Sci. 2017, 7(12), 1239; https://doi.org/10.3390/app7121239
Submission received: 30 September 2017 / Revised: 26 November 2017 / Accepted: 28 November 2017 / Published: 1 December 2017

Abstract

Recent developments and studies in brain-computer interface (BCI) technologies have facilitated emotion detection and classification. Many BCI studies have sought to investigate, detect, and recognize participants’ emotional affective states. The applied domains for these studies are varied, and include such fields as communication, education, entertainment, and medicine. To understand trends in electroencephalography (EEG)-based emotion recognition system research and to provide practitioners and researchers with insights into and future directions for emotion recognition systems, this study set out to review published articles on emotion detection, recognition, and classification. The study also reviews current and future trends and discusses how these trends may impact researchers and practitioners alike. We reviewed 285 refereed journal articles, drawn from 160 journals, that have been published since the inception of affective computing research. The articles were classified based on a scheme consisting of two categories: research orientation and domains/applications. Our results show considerable growth in EEG-based emotion detection journal publications. This growth reflects an increased research interest in EEG-based emotion detection as a salient and legitimate research area. Factors such as the proliferation of wireless EEG devices and advances in computational intelligence techniques and machine learning spurred this growth.

1. Introduction

A brain-computer interface (BCI) is a system that takes a biosignal measured from a person and predicts, in real time, certain aspects of the person’s cognitive state [1,2]. At the outset, BCIs started as assistive technological solutions for individuals with significant speech anomalies. However, the research was rooted in a subject’s desire to communicate through either speech or writing, or to control his or her immediate environment. BCI systems have also used computer-based recreational activities to stimulate a subject’s innate ability to overcome physical disabilities.
Today, BCI-based research has expanded to include people with and without physical disabilities. This expansion underscores how adaptive systems can enhance analytic methods and application areas. This assistive ability has created a widespread awareness among potential users and researchers alike. In the past 15 years, the increasing numbers of BCI research groups, peer-reviewed journals, conference abstracts, and attendees at relevant conferences have been indicators of the rapid growth of interest in this field. Aside from these indicators, numerous companies are collaborating with research groups to develop BCI-related technologies. These companies have further defined clear roadmaps for BCI-related technologies.
There are a number of annual conferences in the human-computer interaction (HCI) and BCI fields, which bring together prominent researchers to present their research projects, such as the ACM Conference on Human Factors in Computing Systems (CHI), the ACM International Conference on Multimodal Interaction (ICMI), the International Conference on Intelligent User Interfaces (IUI), Computer Supported Cooperative Work (CSCW), and the IEEE/ACM International Conference on Computer-Aided Design (ICCAD). These conferences host many workshops, case studies, and courses conducted by industry experts, practitioners, and researchers.
The phenomenal growth of BCI research is aligned with an influx of researchers from diverse disciplines, including clinical neurology and neurosurgery, rehabilitative engineering, neurobiology, engineering, psychology, computer science, mathematics, medical physics, and biomedical engineering. The interdisciplinary nature of BCI research has resulted in the development of BCI systems with different target applications.
Research on BCIs has shown that brain activity can be used as an active or passive control modality [3]. In an active BCI, the user controls a device using brain signals and patterns through a direct and conscious generation of commands that are wired to external applications. In contrast, passive BCIs are systems wherein brain signals yield outputs without any voluntary control. Emotional states, such as levels of meditation, engagement, frustration, excitement, and stress, are examples of affective and cognitive feedback in passive BCIs.
The concept of passive BCI has been applied in various fields, such as affective computing, which aims to improve communication between individuals and machines by recognizing human emotions and to develop applications that adapt to changes in user state, thereby enriching the interaction and leading to a natural and effective user experience. Motivated by a new understanding of brain functions and advances in computer interface devices, countless research studies on emotion detection in real-time procedures for patients and clinicians have been undertaken (e.g., [4,5,6]). Furthermore, similar advancements are under development for additional cognitive mental states, such as attention and workload, which correspond to affective states. These and many other advances in BCI technologies have piqued scientific interest in BCI technology and its application in different contexts.
According to Gartner’s 2016 Hype Cycle report on trending research topics, both Brain-Computer Interface and Affective Computing are at the Innovation Trigger stage. The report predicts that mainstream adoption will occur in 5–10 years for Affective Computing and in more than 10 years for BCI. This phenomenon is captured in Figure 1.
The volume of studies, research, and publications on BCI emotion-based recognition systems has surged in recent years. A plethora of studies with varied research methodologies has led to a broad range of results, depending on the datasets, recording protocol, emotion elicitation technique, detected features, temporal window, classifiers, involved modality, number of participants, and emotion models.
While BCI research encompasses a wide spectrum of applied domains, our research interest is specifically focused on electroencephalography (EEG)-based emotion detection, although objective measures of emotion can also be acquired from physiological cues derived from the physiology theories of emotion. Instruments that measure blood pressure responses, skin responses, pupillary responses, brain waves, heart responses, facial recognition, speech, and posture are often used as objective measures in affective computing [7]. This review will seek to understand EEG-based emotion recognition trends. The review will examine published literature with the aim of providing insights for future emotion recognition systems to practitioners and researchers.
During our literature review, we observed that articles on emotion recognition cut across various disciplines, including clinical neurology, rehabilitation, neurobiology, engineering, psychology, computer science, medical physics, and biomedical engineering. Hence, conducting a comparative analysis of articles is difficult since different journals and different scientific domains have different research focuses and methodologies.
Accordingly, the main objective of this review is to classify and summarize research that is relevant to emotion recognition systems and to provide conceptual frameworks for integrating and classifying emotion recognition articles. This system of classification will be useful for literature reviews on emotion recognition research.
The following sections illustrate our proposed classification framework for emotion recognition literature reviews based on research articles. In Section 2, we outline our research methodology. Section 3 describes our proposed classification framework for emotion-recognition-based literature reviews. Section 4 presents our discussion. In Section 5, we provide insights for future research and discuss the challenges and trends in EEG-based emotion recognition. Finally, in Section 6, we present the study’s conclusions.

2. Research Methodology

Articles on emotion recognition systems are scattered across journals of various disciplines and were found in both medical and non-medical journal publications, including clinical neurology, rehabilitation engineering, neurobiology, engineering, psychology, computer science, medical physics, and biomedical engineering.
We searched Web of Science (WoS), https://webofknowledge.com, to obtain a comprehensive bibliography of the academic literature on emotion-recognition-based BCI. The Web of Science Core Collection database provides quick, powerful access to the world's leading citation databases and covers literature from sources such as Science Direct (Elsevier), the IEEE/IEE Electronic Library, the ACM Digital Library, Springer Link Online Libraries, and Taylor & Francis.
The following subsections describe the procedure we followed in extracting articles, along with our article selection criteria and filtering processes.

2.1. Data Sources and Procedures for the Extraction of Articles

We searched our selected database (the Web of Science Core Collection) for articles over a span of twelve years, 2005–2016. We used Basic Search to look for topics that fall within our research scope. Using basic search settings, we input search terms and phrases such as: affective or emotion; emotion detection or recognition; EEG or electroencephalography; and brain-computer interface, passive BCI, or BCI. The Web of Science results covered articles, meeting abstracts, book chapters, and proceedings papers.
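For concreteness, the individual terms above can be combined into a single Boolean topic query. The query below is a hypothetical reconstruction for illustration only; the authors' exact search string is not reported, and `TS=` is the Web of Science topic-field syntax.

```text
TS=(affective OR emotion OR "emotion detection" OR "emotion recognition")
AND TS=(EEG OR electroencephalography)
AND TS=("brain-computer interface" OR "passive BCI" OR BCI)
```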
The initial search resulted in 766 articles. Table A1 in Appendix A shows a broad categorization and distribution of our search results. We further refined our search based on some predetermined criteria. The following section illustrates our selection criteria.

2.2. Selection Criteria

Three criteria were used to select and accept emotion recognition articles for further review. Articles were excluded if they did not meet the following selection criteria:
  • Articles must address one of the Gartner Hype Cycle 2016 trending research topics. To meet this criterion, they must be relatively current. In this regard, we chose articles that were published between 2005 and 2016. This 12-year period can be considered to correspond to the main research period of emotion recognition systems.
  • We excluded meeting abstracts, book chapters, conference proceedings, workshop descriptions, masters and doctoral dissertations, and non-English articles. Notably, the number of conference papers in this domain was 322, the number of book chapters was 3, and the number of meeting abstracts was 6.
  • We also ensured that only peer-reviewed journal articles were included. The logic behind this is that practitioners and academics frequently use journals both to obtain and to spread research findings. Thus, journal articles represent the highest level of research.

2.3. Filtering/Reviewing Process

In this step, each article was manually reviewed in three rounds. This was done to eliminate non-emotion-recognition-based works and non-BCI-based works. After that step, we classified the filtered articles according to our classification scheme.
During our first round of review, we excluded articles based on our predetermined selection criteria. We excluded all articles that did not address EEG-based emotion recognition systems. After applying our selection criteria, we were left with 435 articles. We imported all of these articles into an online EndNote database, which facilitated managing, assessing, and reviewing them. The remaining articles were then scanned and filtered, as described in the next subsection.
Our second round of review involved the manual scanning of titles, abstracts, authors, keywords, and conclusions. This round excluded articles whose central theme was a subject other than emotion recognition systems, with emotion recognition mentioned only in passing. By the end of this round, we were left with 340 articles.
The final round involved reading the full texts and analyzing each article according to our classification scheme. This scheme is described in the next section. The infographic in Figure 2 shows the procedure used to filter and extract articles that meet our predetermined criteria.

3. Classification Method

To systematically reveal and examine research insights on EEG-based emotion recognition, a literature classification scheme was developed. This classification scheme was based on categorizing the research focuses of the 285 selected articles that remained after the filtering processes. A graphical representation of these categories and subcategories and their relationships is presented in Figure 3.

3.1. Application Domain and Field

EEG-based emotion detection applications can be categorized into two broad domains: Medical and Non-medical. They are further described in the following subsections.

3.1.1. Medical Context

EEG emotion recognition falls within the medical context if the system is designed for the assistance, enhancement, monitoring, assessment, or diagnosis of human psychiatric and neurological conditions.

3.1.2. Non-Medical Context

This category includes EEG-based emotion recognition systems that are designed for entertainment, education, monitoring, or gaming.

4. Results and Discussion

We extracted 285 articles on emotion recognition systems from 29 online databases and 160 different journals. Each article was reviewed and classified according to our classification scheme. Although the extent of our search was limited, it offers comprehensive insight into EEG-based emotion recognition system research. Furthermore, we provide a descriptive overview of the temporal trends of these publications, along with a description of publication domains (e.g., publication research area, online database, and journal), in Appendix B.

4.1. Classification of Articles by Paper Orientation

When we classified journal articles according to paper orientation, we found 20 review articles and 265 experimental design innovation articles.

4.1.1. Review Paper

Of the twenty review papers, seventeen were narrative reviews and three were systematic reviews. We classified these papers according to their focus: general/background, signal processing, classification, application, and others. We found seven general reviews, covering topics including neuroimaging techniques, emotion modeling, and applications; three signal processing reviews; one classification review; seven application reviews; and two reviews that covered training protocols and validity in the EEG-neurofeedback optimal performance field and evidence of neurofeedback learning. Table 1 shows the review article classifications.
The objectives of the earliest reviews (2007 and 2008) were to understand mirror neurons; to cover the EEG correlates of emotions, brain rhythms, and functions; and to present a framework and suggestions for a research program [3,12]. We also found two published review articles that mainly aimed to highlight an emerging field: the neuroscience of culture in [9] and social neuroscience in [10].
Other review articles were published in medical fields. These articles focused on EEG applications and how to use EEGs to diagnose and assess medical conditions. These articles also explored the relationships between symptoms and affective states, such as schizophrenia [17,18], depression [20], disorders of consciousness [19], and autism [21].

4.1.2. Design Innovation (Experimental) Paper

A number of design innovation papers (265) were published in the period of 2005–2016. Technical aspects of design and design implications are important, and there are still many technological challenges and obstacles to the growth of EEG-based emotion detection systems. Therefore, information was extracted from each study on the following aspects: affective states investigated, emotion elicitation method, number of participants, acquisition technique, and emotion recognition techniques, including feature extraction, type of classifier, performance of the classifier, and online (real-time) vs. offline classification.
Emotion model: Emotions are traditionally classified on the basis of two models: the discrete and dimensional models of emotion. Dimensional models of emotion propose that emotional states can be accurately represented as combinations of several psychological dimensions. Most dimensional models incorporate valence and arousal. Valence refers to the degree of ‘pleasantness’ that is associated with an emotion, whereas arousal refers to the strength of the experienced emotion. Discrete theories of emotion propose the existence of small numbers of separate emotions, as characterized by coordinated response patterns in physiology, neural anatomy, and morphological expressions. Six basic emotions that are frequently specified in research papers are happiness, sadness, anger, disgust, fear, and surprise.
Another emotion model is the appraisal model. The appraisal model of emotion is based on the evaluation of currently remembered or imagined circumstances. Basically, the appraisal model proposes that thought precedes emotion and that emotion precedes behavior. The majority of the papers (172 articles, 64.91%) clearly specified that they used a dimensional model, whereas 34 articles (12.83%) used a discrete model; the remaining 21.89% of articles used a different model or did not specify which emotion model they used.
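To make the dimensional model concrete, the short Python sketch below maps a self-reported (valence, arousal) rating pair onto one of four coarse quadrants. The 1–9 rating scale, the midpoint of 5, and the quadrant labels are illustrative assumptions, not conventions drawn from any specific reviewed paper.

```python
# Minimal sketch: mapping valence/arousal self-ratings (assumed 1-9 scales,
# as in SAM-style questionnaires) onto the four quadrants of the
# dimensional model of emotion. Labels and threshold are illustrative.

def quadrant(valence: float, arousal: float, midpoint: float = 5.0) -> str:
    """Map a (valence, arousal) rating pair to a coarse emotion quadrant."""
    if valence >= midpoint:
        return "happy/excited" if arousal >= midpoint else "calm/content"
    return "angry/afraid" if arousal >= midpoint else "sad/bored"

print(quadrant(7.5, 8.0))  # -> happy/excited
print(quadrant(2.0, 3.0))  # -> sad/bored
```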
Emotion elicitation technique: Different emotion elicitation techniques have been developed and reported. Representative examples include standardized emotional stimuli (e.g., pictures, films, and audio), imagination techniques (e.g., guided imagery and autobiographic recall), social interactions (e.g., games), and directed facial action tasks. We found that researchers have used different methods to elicit target emotions. In addition, researchers have found that the use of multiple stimuli is more effective in eliciting emotions than the use of single stimuli. Table 2 shows the numbers of articles that used a combination of different emotion elicitation techniques.
When using elicitation techniques to evoke emotions, the emotional stimuli (image, audio, and video) are usually taken from reputable sources, such as the International Affective Picture System (IAPS) database and the International Affective Digitized Sounds (IADS). In addition, other databases and video clips can be collected from various resources on the internet (e.g., YouTube and Facebook). Other modalities, such as the recall paradigm, where the subject is asked to repeatedly recall emotional instances from their life, and dyadic interaction, where a facilitator helps to induce various emotions, are also used by researchers.
Although the affective information from image, video, and audio stimuli has been extensively studied, olfactory stimuli [26], written words [27,28,29,30,31,32], food stimuli (enriched by emotional stimuli) [33], and games have been used as elicitation methods in a number of studies as ways to assess human emotional state by investigating physiological signals [34,35,36,37].
Single/multimodality: Recently, many studies have shown that combinations of modalities can significantly enhance emotion recognition accuracy in comparison with single modalities. For example, combining eye movements, measured using eye tracking, with EEG can considerably improve the performance of emotion recognition systems [38,39]. Moreover, in [40], the researchers proposed a multimodal emotion recognition system using four sensing methods: EEG, heart inter-beat interval, galvanic skin response, and stressor level lever. The purpose of this study was to measure the human stress response while using a powered wheelchair. It also provided a comprehensive background for multimodal emotional state assessment, proposed a framework, and discussed the feasibility and utility of the multimodal approach. In [8], the researchers used multimodal physiological signals (EEG, galvanic skin response, blood volume pressure, respiration pattern, skin temperature, electromyogram, and electrooculogram) to classify and predict depression. They demonstrated the potential of the multimodal approach and achieved 85.46% accuracy using a support vector machine (SVM). The work in [41] proposes an approach for multimodal video-induced emotion recognition based on facial expression and EEG technologies. In [42], the researchers proposed an emotion recognition system that used a large number of modalities (galvanic skin response, heart rate, blood volume pulse, electrocardiogram, respiration, facial electromyograph, and EEG) while the subjects were watching affective eliciting materials, in order to distinguish six different kinds of emotions (joy, surprise, disgust, grief, anger, and fear). They stated that the integration of these methods facilitates a more detailed assessment of human affective state and improved accuracy and robustness. In [43], the researchers examined the role of emotional arousal on subsequent memory in school-age children using EEG, heart rate, and respiration. Their findings endorsed the value of combining multiple methods to assess emotion and memory in development.
Most previous research into emotion recognition used either a single modality or multiple modalities of different physiological signals. The former allows only limited enhancement of accuracy, and the latter has the disadvantage that its performance can be affected by head or body movements; it also inconveniences the user because of the sensors attached to the body. In our review, we found that the majority of the papers (218 articles, 82.3%) used a single modality, namely EEG signals, as an objective method in their studies, whereas 45 articles (17%) used multiple modalities.
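One common way to combine modalities is feature-level fusion: feature vectors computed from each signal are concatenated and fed to a single classifier. The sketch below illustrates this under assumed toy data; the array shapes, the choice of galvanic skin response (GSR) as the second modality, and the SVM classifier are placeholders rather than the method of any specific study cited above.

```python
# Hedged sketch of feature-level fusion for multimodal emotion recognition:
# EEG and GSR feature vectors are concatenated before classification.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
eeg_feats = rng.normal(size=(40, 32))  # 40 trials x 32 EEG features (toy data)
gsr_feats = rng.normal(size=(40, 4))   # 40 trials x 4 GSR features (toy data)
labels = rng.integers(0, 2, size=40)   # binary affective labels (toy data)

fused = np.hstack([eeg_feats, gsr_feats])           # feature-level fusion
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(fused, labels)                              # train on fused features
```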
Recognition model: The analysis and classification of the EEG signal can be performed either online or offline. The performance time begins when features are extracted and ends when classification has been completed. The number of electrodes used during experimentation in emotion detection settings imposes time constraints on the algorithms. For example, in [44], the authors built a system that detects the current user affective state and obtained a classification accuracy of 65%. In [45,46], the authors tested their methods both online and offline. We observed that the majority of articles (187) used offline analysis, whereas only 15 articles used online analysis.
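A minimal sketch of the online setting described above: a fixed window slides over the incoming EEG stream, features are extracted from each window, and a pretrained classifier produces one label per window. The sampling rate, window length, and the `extract_features` and `clf` placeholders are assumptions for illustration, not a recipe from the reviewed papers.

```python
# Illustrative online (real-time) recognition loop: classify each incoming
# EEG window as soon as it is complete.
import numpy as np

FS = 128           # sampling rate in Hz (assumed)
WINDOW = 2 * FS    # 2-second analysis window (assumed)

def classify_stream(stream, extract_features, clf):
    """Slide a fixed window over streamed EEG and yield a label per window.

    stream: iterable of multichannel samples; extract_features: callable
    returning a 1-D feature vector; clf: any fitted scikit-learn-style model.
    """
    buffer = []
    for sample in stream:
        buffer.append(sample)
        if len(buffer) == WINDOW:
            feats = extract_features(np.asarray(buffer))
            yield clf.predict(feats[None, :])[0]
            buffer.clear()             # non-overlapping windows (assumption)
```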
EEG device: A number of EEG devices have been used in EEG-based BCI research in medical and academic settings. We found up to 48 different devices from different companies all over the world. In Table 3, we specify the most commercially available and widespread devices that have been used in more than five articles.
Electrodes: Considering the number of electrodes, the time required to set up an EEG device, the comfort level of subjects, system usability, and the number of features to be processed, fewer electrodes are advisable; for example, five channels were used in [176,190]. Nonetheless, most current EEG devices still require a relatively large number of electrodes; for example, 64 channels were used in [93,95], and 32 channels were used in [43,71,191]. We found that the maximum number of electrodes used when recording EEG signals was 257 in [141], whereas the minimum number was one in [192]. In the latter study, the electrode was placed at Fpz according to the international 10–20 system, and two reference electrodes were located on the left and right ear lobes. Notably, other physiological signals were recorded in this study, including facial electromyogram, skin conductivity, and respiration data.
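In software, selecting a reduced montage amounts to indexing a subset of channels from the full channel list. The toy sketch below illustrates this; the particular five frontal channels chosen are an illustrative assumption, not a recommendation from the cited studies.

```python
# Toy sketch: reducing a full 10-20 montage to a small frontal subset,
# in the spirit of the low-electrode studies cited above. Channel names
# follow the international 10-20 system; the subset is an assumption.
FULL_MONTAGE = ["Fp1", "Fp2", "F3", "F4", "F7", "F8", "Fz",
                "C3", "C4", "Cz", "P3", "P4", "Pz", "O1", "O2"]
SUBSET = ["Fp1", "Fp2", "F3", "F4", "Fz"]  # 5-channel frontal subset

keep_idx = [FULL_MONTAGE.index(ch) for ch in SUBSET]
# eeg_reduced = eeg_full[keep_idx, :]  # select rows of a channels x samples array
```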
Benchmark EEG emotional databases: Only a small number of benchmark emotional EEG databases with categorized emotions are publicly available for use and to test a new system. These datasets are available to the academic community via a web-based system and have been used in a number of research studies to test and explore proposed systems. Table 4 lists the benchmark datasets, along with descriptions and references to the articles that have used them. However, most of the reviewed articles (239 articles, 88.7%) recorded their own EEG signals and used them during their experiments.
Moreover, in [193], the authors evaluated their approach on three benchmark databases: DEAP, MAHNOB-HCI, and USTC-ERVS. USTC-ERVS is an EEG dataset that contains 197 EEG responses to 92 video stimuli from 28 users, along with users’ emotional self-assessments. However, the USTC-ERVS dataset is no longer available online. Either its page was moved or the URL was completely expunged from the initial database location.
Participants: The number of participants can vary based on the experiment type and field. In our review, we found that the minimum number of participants was one. In [137], an EEG investigation was carried out on a patient with complete cortical blindness who presented with affective blindness. The maximum number of participants was 300 in [42]. In this study, EEG-based physiological signals of Chinese college students were recorded to establish a large effective physiological signal database.
Moreover, different age groups were considered as study samples in different experiments, ranging from infants [132,143,169], children [6,43,75,84,89,103,125,126,128,129,130,140,183,184,210,211,212,213], and adolescents [59,65,70,135,139] to the elderly [105,214]. Some studies were conducted on a single gender, such as those investigating women's emotions in relation to motherhood and parental status [133,134,164] and a study of men [215]. Some studies targeted a specific type of subject, such as healthy people or patients, to investigate and observe the differences in emotions between two groups, such as patient and healthy control groups in [76,107,135], groups of women and men in [46,97,177], young adults versus older adults in [87,216], or children versus adults in [126,143].
EEG correlates of emotion (signals): Numerous research studies have examined the neural correlates of emotion in humans. Frontal EEG asymmetry, event-related desynchronization/synchronization, event-related potentials, and steady-state visually evoked potentials have been found to be associated with emotional states. We found that the majority of articles (130) used event-related potentials in their analysis, whereas 48 articles used frontal EEG asymmetry, six articles used event-related desynchronization/synchronization, and four used steady-state visually evoked potentials.
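Frontal EEG asymmetry is typically quantified as the difference in log-transformed alpha-band power between homologous right and left frontal electrodes (commonly F4 and F3). A minimal sketch follows, assuming a 128 Hz sampling rate, an 8–13 Hz alpha band, and Welch's method for spectral estimation; the exact recipe varies across the reviewed studies.

```python
# Hedged sketch of the classic frontal alpha asymmetry index:
# FAA = ln(alpha power at F4) - ln(alpha power at F3).
import numpy as np
from scipy.signal import welch

def alpha_power(signal, fs=128, band=(8.0, 13.0)):
    """Mean power spectral density of one channel within the alpha band."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def frontal_asymmetry(f4, f3, fs=128):
    """Positive values indicate more alpha on the right, which is usually
    read (via the inverse alpha-activation convention) as relatively
    greater left-frontal activity."""
    return np.log(alpha_power(f4, fs)) - np.log(alpha_power(f3, fs))
```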
Emotion types and numbers: Human feelings and emotions can be quantified for various emotional states. However, only certain types of emotions can be recognized using EEG. Moreover, identifying the key emotional states to be recognized is essential; for example, six emotions were detected in [42,45,106,120,122,192], whereas in [108], a real-time EEG-based system to classify happy and unhappy emotions was proposed, and in [113], a fear evaluation system was proposed. In our review, we found that most articles aim to detect unpleasant, pleasant, and neutral emotions, as in [105,217], or positive, negative, and neutral emotions based on the valence-arousal dimensional emotion model, as in [159,206].
Computational methods to estimate affective state: Various techniques and approaches have been proposed in the literature for the processing steps that estimate emotional state from the acquired signals. We reviewed recent publications and compared computational methods and results for a sample of the collected papers, selecting 2015–2016 as the publication timeframe to reflect recent trends and methodologies for emotion detection. The computational methods used to extract and classify emotional features from EEG are summarized in Table 5.
It is noteworthy that no single feature extraction technique is optimal across all applications. Moreover, features from a single analysis domain are often insufficient for high-accuracy recognition; several approaches introduce additional features from different analysis domains to capture extra information about the state of the brain [107,117,200,203,213,216,224]. Consequently, feature extraction is one of the major challenges in designing BCI systems; it depends on the chosen features and on the appropriate transformation. Although which EEG features are most emotion-relevant is still under investigation, power features from different frequency bands remain the most popular in the context of emotion recognition. Studies [26,197,218] have shown that power spectral density (PSD) features extracted from EEG signals perform well in distinguishing affective states.
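As a concrete illustration of this most popular feature type, the sketch below computes PSD band power features per channel with Welch's method. The band boundaries are conventional assumptions; the reviewed studies vary in the exact bands and spectral estimators used.

```python
# Hedged sketch: band power features from the PSD of each EEG channel.
import numpy as np
from scipy.signal import welch

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def psd_band_features(eeg, fs=128):
    """eeg: channels x samples array -> flat vector of band powers per channel."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2, axis=-1)
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1)
             for lo, hi in BANDS.values()]
    return np.concatenate(feats)  # length = n_bands * n_channels

# Usage (toy data): 32 channels, 10 s at 128 Hz -> 128-dimensional feature vector.
x = psd_band_features(np.random.default_rng(0).normal(size=(32, 1280)))
```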
Several machine learning algorithms have been used as emotion classifiers, such as support vector machines (SVM), K-nearest neighbors (K-NN), linear discriminant analysis (LDA), random forests, Naïve Bayes (NB), and artificial neural networks (ANN). In general, the choice of classification algorithm when designing a BCI depends largely on both the type of brain signal being recorded and the type of application being controlled. However, SVM based on frequency-domain features such as power spectral density (PSD) is the most commonly used method.
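The sketch below compares three of the classifier families named above (SVM, K-NN, LDA) on the same feature matrix using cross-validation. The toy data and hyperparameters are illustrative assumptions, so the scores will sit near chance; with real PSD features, this is the kind of comparison the reviewed papers report.

```python
# Hedged sketch: cross-validated comparison of common EEG emotion classifiers.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 64))    # 60 trials x 64 PSD-style features (toy data)
y = rng.integers(0, 3, size=60)  # three affective classes (toy labels)

for name, est in [("SVM", SVC(kernel="rbf")),
                  ("K-NN", KNeighborsClassifier(n_neighbors=5)),
                  ("LDA", LinearDiscriminantAnalysis())]:
    clf = make_pipeline(StandardScaler(), est)
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {acc:.2f}")  # near chance on random toy data
```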
Recently, deep learning methods have been applied to the EEG-based emotion recognition field. In [49], the authors proposed a deep belief network (DBN) classifier to recognize three categories of emotions (positive, neutral, and negative). Their experimental results show that the DBN models obtain higher accuracy than SVM and K-NN methods. Recurrent neural networks have also been used, in [74,208,229]. In [206], Chai et al. proposed an autoencoder-based deep learning method to investigate emotion recognition on the SEED dataset.
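The deep models cited above (DBNs, recurrent networks, autoencoders) require dedicated frameworks; as a rough, hedged stand-in, the sketch below trains a small feedforward network on PSD-style features. It is not a DBN, only a minimal neural baseline under assumed toy data, meant to show where a deep classifier slots into the same pipeline.

```python
# Minimal neural baseline (NOT a DBN): a small feedforward network trained
# on the same kind of feature matrix used by the classical classifiers.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
X = rng.normal(size=(120, 64))    # toy PSD-style features
y = rng.integers(0, 3, size=120)  # positive / neutral / negative (toy labels)

net = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0),
)
net.fit(X, y)  # replace the toy arrays with real features and labels
```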

4.2. Classification of Articles by Application Domain and Field

We further classified the 285 articles that we extracted into areas of real-life application. Notably, we classified them into medical and non-medical fields of application. The following criteria were used to classify the articles under the medical field:
  • the paper discussed a medical condition (disorder/disease), such as a psychiatric or neurological case;
  • the participants of the experiment were patients or it involved two groups: one consisting of healthy people, the other of patients;
  • the experiment was conducted in a clinical setting; and/or,
  • the paper was directed toward the medical community and suggested a new method for assistance, enhancement, monitoring, or diagnosis using emotion-based recognition.
After evaluating all 285 articles according to the medical classification criteria listed above, we found that 67 articles, or 23.51%, were medical articles. The remaining 218 articles, or 76.5%, were non-medical. Distributions of medical and non-medical articles according to year of publication are presented in Figure 4. We further classified the articles into areas of application within each domain. Figure 5 shows a treemap of the 285 articles, classified according to field and application domain.

4.2.1. Medical Applications

In the medical field, EEG-based emotion detection systems are used for assisting, monitoring, enhancing, or diagnosing patients’ affective states. These medically adapted EEG systems are also used to analyze different types of neurodevelopmental disorders. Some of these neurodevelopmental disorders affect the memory, emotion, learning ability, behavior, and communication of persons suffering from these conditions. The most common cases are Schizophrenia, Autism, Depression, Huntington’s disease (HD), and a myriad of psychiatric and neurological diseases.
Table 6 shows a classification of medical articles into areas of application, including Assessment (22 articles, 32.8%), Assistance (13 articles, 19.4%), Diagnosis (8 articles, 11.9%), Monitoring (20 articles, 29.9%), and Other (4 articles, 6%).
Generally, neurological studies seek to understand how defects in neurobiological processes result in problems associated with neural functioning. These studies are conducted with the goal of understanding how individual differences in brain structure and function influence affective, cognitive, and behavioral outcomes.
Different approaches have been proposed, and several research groups have developed EEG-based BCI systems that aim to detect these affective states. Examples of medical applications are hereunder identified.
Depression is a mental disorder that is related to a state of sadness and dejection. It affects the emotional and physical state of a person. Using EEG-based emotion recognition as a diagnostic aid for depression has produced promising results. A number of studies are based on the automated classification of normal and depression-related EEG signals. Such automatic classification systems could serve as useful diagnostic and monitoring tools for the detection of depression [20,79,133,135,226,228,231].
Persons with schizophrenia exhibit flat facial expressions and rarely show positive emotions. These abnormal personality traits may impact social functioning and communication. Whether these deficits reflect an aberrant sensory process, an inability to retain information in memory, or a dysfunctional integration of these two functions remains unclear. However, studies have shown that sensory processing and memory functioning may be affected in schizophrenic patients [5,17,18,54,62,66,72,73,90,94,98,180,233,236]. These studies’ experimental protocols were intended to clarify patient deficits in processing emotional faces.
Persons with Parkinson's disease have shown deficits in emotional recognition abilities. Research findings on the reason for this are inconclusive. Nine articles discussed the idea of using EEG-based emotion detection to provide assistance, monitoring, assessment, and diagnosis of Parkinson’s disease in patients [47,106,107,114,118,119,120,121,122].
EEG-based emotion detection systems for cochlear implant patients have been proposed in [156,211,238]. EEG signals and different stimuli are used to estimate the ability of these patients to recognize emotional sounds. Three studies compared the emotional responses of two groups of study participants: the group comprising children with normal hearing displayed a withdrawal/approach pattern, whereas the cochlear implant users did not.
Other medical cases include autism [6,21,75,140,212,213], bipolar disorder [124,145], epilepsy [230,234,235], attention-deficit/hyperactivity disorder [103,165], bulimia nervosa [33,168], borderline personality disorder [167], pervasive developmental disorder [81], and eye movement desensitization and reprocessing [125].

4.2.2. Non-Medical Applications

EEGs have been utilized in numerous non-medical applications. These applications are employed for both healthy and physically challenged individuals. Non-medical fields where EEGs have been applied include entertainment, education, monitoring, and gaming. Table 7 shows a classification of non-medical articles into areas of application: Monitoring (95 articles, 43.6%), New method (60 articles, 27.5%), Entertainment (25 articles, 11.5%), Marketing (4 articles, 1.8%), Education (2 articles, 0.9%), Assistance (10 articles, 4.6%), and Other (22 articles, 10.1%).
Different approaches have been proposed, and several research groups have developed EEG-based BCI systems that aim to detect affective state. Examples of non-medical applications are hereafter identified.
Recently, a new research area has appeared in the marketing field: neuro-marketing. The goal of this new area is to understand consumer responses to marketing stimuli by using imaging techniques and the recognition of physiological parameters. Because customer feelings in sales areas are strongly influenced by the perception of the surroundings, recognition of emotional responses can reveal true consumer preferences and assist in the buying process. Four articles have presented this idea in different applications [102,123,218,280].
Similarly, an EEG-based experimental study [144] identified the temporal point at which smokers’ responses to health warnings begin to differ. The authors aimed to determine the effects of graphic pictorial cigarette package health warnings by assessing selective attentional orientation and measuring emotional processing; they reported that smokers are less sensitive to the emotional content of cigarette health warnings. Therefore, future health warning development should focus on increasing the emotional salience of pictorial health warning content among smokers.
Three research groups have explored EEG emotion detection systems for emotion (stress) monitoring during self-driving of a powered wheelchair [40,227,285]. These studies aim to investigate the ability to assist and enhance BCI-based wheelchairs by integrating emotion detection while controlling a wheelchair.
The relationship between music genres and human emotions has been investigated in several recent BCI studies. In these studies, brain signals were recorded using an EEG headset while the subject listened to music [44,53,58,100,110,112,115,116,151,154,190,205,216,220,222,235,276,279]. Moreover, the subjects’ emotions, as reflected in the EEG signals, were recognized and then used to tag multimedia data [71,101,193,277]. These studies also investigated methods for implicit tagging, wherein users’ responses to interactions with multimedia content are analyzed to generate descriptive tags [71].
Recent BCI research from different disciplines approaches EEG emotion detection and recognition via diverse methods. Some of these methods include feature extraction and selection, machine learning, and pattern recognition methods. These different methods are used to build EEG-based emotion recognition systems. Several research groups (63 articles, 21%) have proposed novel/effective methods for building improved EEG-based emotion recognition systems. Three articles have proposed and described EEG emotion database benchmarks [42,92,101].
Another non-medical application of EEG-based emotion recognition is word processing. Because individual emotional differences impact word processing, differences in interpreting a string of words may elicit different emotional responses. These varying emotional responses are caused by involuntary (implicit) semantic processing, lexical decision tasks (LDTs), and interpretations of perceived positive or negative emotional words [29,31,32,150,160].
Several research groups have explored EEG emotion detection systems as means for monitoring levels of attention and measuring workload [176,188]. One important use is to monitor the level of alertness in security-critical tasks such as driving and surveillance. For example, the Air Force Multi-Attribute Task Battery was used as a realistic, ecologically valid multitask environment, in which a participant's workload could be varied [188].
Recently, EEG-based emotion recognition has been proposed as a technique that can support classification tasks, such as EEG-based emotional state clustering [50], image classification [219], and odor pleasantness classification using brain and peripheral signals [26].
Other non-medical applications of EEG-based emotion recognition include lie detection, security-critical tasks, and driving. For instance, EEG emotion applications are used in untangling criminal cases in legal proceedings and to tell whether an individual is telling the truth [97,139,274,284].
Two articles discussed how EEG-based emotion detection systems could be used to monitor the alertness of humans when performing security-critical tasks. The researchers proposed a real-time low-level attention detection application that can measure a driver’s degree of attention [23]. Likewise, a driver’s emotional and stress levels can be monitored under different conditions [23,283].

5. Challenges and Future Directions

One challenge is that the detection and modeling of emotions in the context of human-computer interaction (HCI) remains complex and requires further exploration. In this context, future research on EEG-based emotion recognition will be explored through BCI design. From our perspective, Figure 6 provides insights into future research, challenges, and trends in EEG-based emotion recognition.
In general, there are many challenges associated with BCI systems, which can be classified as technology related and/or user related. Technology-related challenges include sensor impedance, system usability, and real-time constraints; at the device level, there are further concerns, such as perceived obtrusiveness, information transfer rate, and high error rates. User-related challenges include the unfamiliarity of the participating subjects or patients with BCI technologies, discrepancies between ratings, and the duration of setup and preparation. Testing requires assistance from a facilitator in applying the electrodes, which makes the training phase time-consuming.
The BCI design requires multidisciplinary skills from such fields as neuroscience, engineering, computer science, psychology and clinical rehabilitation to achieve the goal of developing an alternative communication and control medium. The number of EEG-based emotion recognition research studies in recent years has been increasing, yet EEG-based emotion recognition is still a new area of research. The effectiveness and efficiency of these algorithms are somewhat limited. Computational methods are being used to estimate emotional state; nevertheless, they can be further improved with technological advancements in order to increase the effectiveness of these algorithms. Some examples of limitations in current algorithms and approaches involve time constraints, accuracy, the number of electrodes, the number of recognized emotions, and benchmark EEG affective databases.
Another limitation is the accuracy and reliability of sensory interfacing and translation algorithms in BCI systems. These factors generally limit the usage of these technologies in clinical settings. Other fields have limitations and challenges as well. For example, engineering challenges focus on the low signal-to-noise ratio in noninvasive electroencephalography (EEG) signals. Moreover, computational challenges include the optimal placement of a reduced number of electrodes and the robustness of BCI algorithms to a smaller set of recording sites.
As we mentioned previously, the challenges currently facing EEG-based BCI systems are two-pronged: technological and user related. We anticipate that some of these challenges will be resolved in the future. For the convenience of description, we have categorized these challenges according to the time that we predict it will take to resolve them: near-term (2–5 years), mid-term (6–9 years), and long-term (10+ years). Although we have categorized these challenges, possible trends and future solutions into three time phases, it is worth noting that some of these anticipated future trends might come earlier, overshoot their time frame, be delayed, or never be achieved.
We anticipate that some of the challenges we have identified may be addressed in 2 to 5 years’ time. These include familiarizing study participants and patients with BCI electronic gadgets; the proliferation of mood and emotion apps on mobile devices; increased multidisciplinary cooperation in BCI systems; and formulating more accurate and precise definitions of emotion processing in human neurophysiological studies of EEG correlates of emotion, as noted by Kim et al. in [16].
In addition to the above-stated trends, we anticipate extensive advancement in body sensors, cameras, and head-mounted devices. Moreover, sensor technology is advancing at a rapid rate, as noted by Alotaiby et al. in [13]. We estimate that these developments will contribute to the emergence of rudimentary affective computing systems in approximately six to nine years’ time. In ten or more years’ time, we anticipate better machine learning and pattern recognition algorithms, in addition to improved accuracy, speed, and elimination of latency in BCI systems. Advances in computational methods are expected to reach a reasonable rate of diffusion and serve as key elements for practical online emotion detection systems, as noted by Kim et al. in [16] and by Jenke et al. in [14].
Moreover, we believe there is potential for using hybrid approaches in classification and feature extraction methods. A second direction is sharing datasets and making them accessible to researchers for further testing. A third is optimizing the number of electrodes, which is increasingly recognized as important for computation. A fourth is combining different modalities with EEG. Together, these directions would facilitate the work of future researchers in this domain.

6. Conclusions

Recent developments and studies in BCI technologies have facilitated emotion detection and classification. These BCI studies set out to investigate, detect, and recognize a participant’s emotional affective state. These studies also sought to apply research findings in varied contexts, including communication, education, entertainment, and medical settings. Moreover, increasing numbers of BCI researchers are conducting novel experiments, such as considering different responses in various frequency bands, different ways of eliciting emotions, and various models of affective states. These various approaches have been employed in gauging the emotional states of participants/patients from BCI acquisition signals.
This study set out to review published articles on emotion detection, recognition, classification, and current and future trends. Thereafter, it sought to provide insight into how current and future trends will impact researchers and practitioners alike. To achieve this, we mined 29 online databases, selected 160 journals, and extracted 285 articles on emotion recognition systems. Each article was reviewed, analyzed, and categorized according to publication year, research methods, primary contribution, and publication outlet.
Our classification and descriptive review provides quality reference sources for academics, researchers, and practitioners working in the field of emotion detection and recognition. This study also contributes to our understanding of various applied concepts of emotion recognition using BCI in different contextual fields. Our results showed explosive growth in the number of EEG-based emotion detection journal publications. This reflects increased research interest in EEG-based emotion detection as a salient and legitimate research area. Factors such as the proliferation of wireless EEG devices, advanced computational intelligence techniques, and machine learning spurred this growth. In general, we expect exponential growth in the amount of EEG-based emotion detection research in the near future. As would be anticipated of any new research field, the field of EEG-based emotion detection and recognition is fraught with challenges. Our results suggest that the challenges raised herein may be resolved in the near future, thereby driving further growth and increased research interest in BCI systems.
As noted in Section 5, EEG-based emotion detection involves multiple dimensions that need to be considered in research methodologies. Researchers need to take into account the usability of devices, the accessibility of datasets, the optimization of computational methods, and the combination of different modalities in the design of their EEG-based studies.
Regarding the computational methods that can be used in the feature extraction and classification phases, it appears from our review that no particular feature extraction or classification technique emerges as the single best choice for all applications. The choice depends on the specific system paradigm and task.
It has been recommended to consider as many algorithms as possible to determine the validity of the proposed process, including preprocessing and synchronization. In most cases, one should compare performance with a range of features and techniques before settling on a choice that yields adequate performance for the given application.

Acknowledgments

This research project was supported by a grant from the “Research Center of the Female Scientific and Medical Colleges”, Deanship of Scientific Research, King Saud University. The authors thank Ms. Sarah Almojel and Ms. Kholod Alharthi for their assistance in collecting the data.

Author Contributions

Abeer Al-Nafjan conceived, designed, performed the systematic search of the literature, analyzed and interpreted the data, and drafted the manuscript. Areej Al-Wabil and Manar Hosny supervised the analysis, reviewed the manuscript, and contributed to the discussion. Yousef Al-Ohali advised on this study. All authors have read and approved the submitted version of the manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Distribution of the results according to the Web of Science categorization.

Range | Field/Domain | Number of Publications
>100 | Neurosciences | 202
10–100 | Psychology Experimental | 64
10–100 | Psychology | 62
10–100 | Psychiatry | 44
10–100 | Clinical Neurology | 43
10–100 | Physiology | 40
10–100 | Computer Science Artificial Intelligence | 37
10–100 | Behavioral Sciences | 34
10–100 | Psychology Biological | 25
10–100 | Neuroimaging | 24
10–100 | Radiology Nuclear Medicine Medical Imaging; Multidisciplinary Sciences | 23
10–100 | Psychology Multidisciplinary | 18
10–100 | Computer Science Cybernetics | 17
10–100 | Psychology Developmental; Engineering Biomedical | 14
10–100 | Engineering Electrical Electronic | 13
10–100 | Computer Science Information Systems | 10
1–9 | Mathematical Computational Biology; Computer Science Interdisciplinary Applications | 9
1–9 | Rehabilitation | 7
1–9 | Psychology Clinical | 6
1–9 | Robotics; Psychology Social; Pharmacology Pharmacy; Medical Informatics; Audiology Speech-Language Pathology | 5
1–9 | Operations Research Management Science; Medicine Research Experimental; Linguistics; Ergonomics; Computer Science Theory Methods; Computer Science Software Engineering | 4
1–9 | Public Environmental Occupational Health; Pediatrics; Instruments Instrumentation; Engineering Multidisciplinary; Education Special | 3
1–9 | Telecommunications; Medical Laboratory Technology; Electrochemistry; Computer Science Hardware Architecture; Chemistry Analytical; Biology; Automation Control Systems; Anesthesiology | 2
1–9 | Substance Abuse; Sport Sciences; Social Issues; Psychology Psychoanalysis; Optics; Otorhinolaryngology; Medicine General Internal; Materials Science Multidisciplinary; Materials Science Biomaterials; Integrative Complementary Medicine; Ethics; Endocrinology Metabolism; Geriatrics Gerontology; Genetics Heredity; Imaging Science Photographic Technology; Health Care Sciences Services; Education Educational Research; Biotechnology Applied Microbiology; Chemistry Medicinal; Acoustics | 1

Appendix B

Appendix B.1. Trends in Number of Publications for EEG-Based Emotion Recognition

Figure A1 is a line graph showing articles published in the time frame from 2005 to 2016. Between 2005 and 2009, the graph shows a steady rise in EEG-based emotion detection publications. Thereafter, article publications rose rapidly from 2010 to 2016. The rapid rise in EEG-based emotion detection publications is expected due to increased knowledge of neurobiological processes, computers with faster computational processing, greater availability of devices for recording brain signals, and more powerful signal processing and machine learning algorithms. All of these factors positively contributed to the development of improved affective BCI technology. Aligned with the Gartner Hype Cycle, there was a marked increase in research articles related to Affective Computing.
Figure A1. Classification of articles by publication year.
The graph also shows that, between 2005 and 2008, only 10 articles related to the subject were published. The volume of publications then rose, reaching 15 articles in 2009, and averaged 17 articles per year between 2010 and 2013. Thereafter, the number of articles published increased from 46 articles in 2014 to 48 articles in 2015. By 2016, the number of published articles had increased significantly, to 62.

Appendix B.2. Classification of Articles by Research Area

The distribution of the results, according to the Web of Science categorization, is shown in Table A1. In this table, we can see that most of the articles were published in journals related to neuroscience, biology, physiology, computer science, and electronic engineering.
To classify these articles by their research areas, we had to determine into which broad areas they fit. Such areas as Neuroscience, Engineering, Computer Science, and Healthcare were analyzed. In Figure A2 (shown below), we can see the inter-connectedness of EEG-based emotion detection systems across several disciplines. To create this plot, we performed the following steps: (1) determine a list of sub-subjects that could be related to EEG-based emotion detection; (2) research how each subject is related to the others on an individual basis; and finally (3) visually display, using a Venn diagram, not only how the subjects are related to one another but also how strongly they are related to the main subject, which is the study of EEG-based emotion detection systems.
The strength of each relationship is indicated by the size of the bubble. The relationships between subjects were derived using Google Scholar and Web of Science: the more published papers connecting two sub-subjects, the stronger the correlation shown between those fields of study.
Figure A2. Relationships among major underlying fields.

Appendix B.3. Classification of Articles by Online Database and Journal

There are a total of 285 articles, drawn from 29 online databases. Classifications of articles by online database and journal are shown in Table A2 and Table A3, respectively.
In Table A2, we specify our top ten online article databases. Each of these hosted a minimum of five article publications that matched our search criteria. Of these, Science Direct had the highest percentage of articles (103 articles, 36.1%). The likely reason for this is that Science Direct publishes a range of different journals, such as Neurocomputing; International Journal of Psychophysiology; Biomedical Signal Processing and Control; Computers in Biology and Medicine; Brain and Cognition; Brain Research; and Computers in Human Behavior. All of these journals publish EEG-based emotion recognition system articles. Similarly, Springer Link also hosts a cross-section of journal articles from various fields. Consequently, their database recorded the second-highest number of articles: 27 articles, or 9.5% of our total research articles. Other online databases are IEEE Xplore (23 articles, or 8.1%), Frontiers (22 articles, or 7.7%), Taylor & Francis (16 articles, or 5.6%), and Wiley Online Library (12 articles, or 4.2%).
Table A2. Classification of articles based on the online database.
Online Database | Number of Articles | Timeframe
Science Direct | 103 | 2005–2016
Springer Link | 27 | 2006–2016
IEEE Xplore | 23 | 2006–2016
Frontiers | 22 | 2010–2016
Taylor & Francis | 16 | 2006–2016
Wiley Online Library | 12 | 2007–2015
Plos.org | 10 | 2012–2016
World Scientific | 8 | 2010–2016
Hindawi | 6 | 2013–2016
Oxford | 6 | 2008–2016
Table A3 shows the number of articles published in each journal; we documented only journals with more than two published articles. Most of these journals are related to neuroscience, biology, physiology, computer science, electronic engineering, and healthcare. Of these, PLOS ONE published 10 articles (3.5%), while the International Journal of Psychophysiology and IEEE Transactions on Affective Computing each published 9 articles (3.2%).
Table A3. Classification of articles based on the journal.
Journal (Impact Factor IF, Citation C) | Number of Articles
PLOS ONE (IF: 2.806, C: 188) | 10
International Journal of Psychophysiology (IF: 2.582, C: 360); IEEE Transactions on Affective Computing (IF: 3.149, C: 1593) | 9
Frontiers in Human Neuroscience (IF: 3.209, C: 73); Neuroimage (IF: 5.835, C: 283); Frontiers in Psychology (IF: 2.323, C: 71) | 7
Neurocomputing (IF: 3.317, C: 185); Neuropsychologia (IF: 3.197, C: 321); Social Neuroscience (IF: 2.255, C: 185) | 6
Frontiers in Neuroscience (IF: 3.566, C: 102); Neuroscience Letters (IF: 2.180, C: 139); Clinical Neurophysiology (IF: 3.866, C: 149) | 5
IEEE Transactions on Information Technology in Biomedicine (IF: 2.493, C: 558); Sensors (IF: 2.677, C: 30); Social Cognitive and Affective Neuroscience (IF: 3.937, C: 99); Biological Psychology (IF: 3.070, C: 40); Brain and Cognition (IF: 2.432, C: 194); Schizophrenia Research (IF: 3.986, C: 227) | 4
IEEE Transactions on Autonomous Mental Development (IF: 1.638, C: 49); Journal of Visualized Experiments (IF: 1.325, C: 5); Behavioural Brain Research (IF: 3.002, C: 39); Computers in Human Behavior (IF: 3.435, C: 31); Psychiatry Research (IF: 2.528, C: 18); Cognitive Affective & Behavioral Neuroscience (IF: 3.263, C: 42); Cognitive Neurodynamics (IF: 1.828, C: 17); Journal of Neural Transmission (IF: 2.392, C: 86) | 3

References

  1. Pun, T.; Alecu, T.L.; Chanel, G.; Kronegg, J.; Voloshynovskiy, S. Brain-computer interaction research at the Computer Vision and Multimedia Laboratory, University of Geneva. IEEE Trans. Neural Syst. Rehabil. Eng. 2006, 14, 210–213. [Google Scholar] [CrossRef] [PubMed]
  2. Esfahani, E.T.; Sundararajan, V. Using brain-computer interfaces to detect human satisfaction in human-robot interaction. Int. J. Humanoid Robot. 2011, 8, 87–101. [Google Scholar] [CrossRef]
  3. Schupp, H.T.; Flaisch, T.; Stockburger, J.; Junghofer, M. Emotion and attention: Event-related brain potential studies. Prog. Brain Res. 2006, 156, 31–51. [Google Scholar] [PubMed]
  4. Chien, V.S.C.; Tsai, A.C.; Yang, H.H.; Tseng, Y.L.; Savostyanov, A.N.; Liou, M. Conscious and non-conscious representations of emotional faces in Asperger’s syndrome. JOVE J. Vis. Exp. 2016. [Google Scholar] [CrossRef] [PubMed]
  5. Csukly, G.; Stefanics, G.; Komlosi, S.; Czigler, I.; Czobor, P. Event-related theta synchronization predicts deficit in facial affect recognition in schizophrenia. J. Abnorm. Psychol. 2014, 123, 178–189. [Google Scholar] [CrossRef] [PubMed]
  6. Friedrich, E.V.C.; Sivanathan, A.; Lim, T.; Suttie, N.; Louchart, S.; Pillen, S.; Pineda, J.A. An effective neurofeedback intervention to improve social interactions in children with autism spectrum disorder. J. Autism Dev. Disord. 2015, 45, 4084–4100. [Google Scholar] [CrossRef] [PubMed]
  7. Liberati, G.; Federici, S.; Pasqualotto, E. Extracting neurophysiological signals reflecting users’ emotional and affective responses to BCI use: A systematic literature review. Neurorehabilitation 2015, 37, 341–358. [Google Scholar] [CrossRef] [PubMed]
  8. Verma, G.K.; Tiwary, U.S. Multimodal fusion framework: A multiresolution approach for emotion classification and recognition from physiological signals. Neuroimage 2014, 102, 162–172. [Google Scholar] [CrossRef] [PubMed]
  9. Rule, N.O.; Freeman, J.B.; Ambady, N. Culture in social neuroscience: A review. Soc. Neurosci. 2013, 8, 3–10. [Google Scholar] [CrossRef] [PubMed]
  10. Keysers, C.; Fadiga, L. The mirror neuron system: New frontiers. Soc. Neurosci. 2008, 3, 193–198. [Google Scholar] [CrossRef] [PubMed]
  11. Grossmann, T.; Johnson, M.H. The development of the social brain in human infancy. Eur. J. Neurosci. 2007, 25, 909–919. [Google Scholar] [CrossRef] [PubMed]
  12. Muthukumaraswamy, S.D.; Johnson, B.W. A dual mechanism neural framework for social understanding. Philos. Psychol. 2007, 20, 43–63. [Google Scholar] [CrossRef]
  13. Alotaiby, T.; Abd El-Samie, F.E.; Alshebeili, S.A.; Ahmad, I. A review of channel selection algorithms for EEG signal processing. EURASIP J. Adv. Signal Process. 2015. [Google Scholar] [CrossRef]
  14. Jenke, R.; Peer, A.; Buss, M. Feature extraction and selection for emotion recognition from EEG. IEEE Trans. Affect. Comput. 2014, 5, 327–339. [Google Scholar] [CrossRef]
  15. Knyazev, G.G. Motivation, emotion, and their inhibitory control mirrored in brain oscillations. Neurosci. Biobehav. Rev. 2007, 31, 377–395. [Google Scholar] [CrossRef] [PubMed]
  16. Kim, M.K.; Kim, M.; Oh, E.; Kim, S.P. A review on the computational methods for emotional state estimation from the human EEG. Comput. Math. Methods Med. 2013, 2013. [Google Scholar] [CrossRef] [PubMed]
  17. Isaac, C.; Januel, D. Neural correlates of cognitive improvements following cognitive remediation in schizophrenia: A systematic review of randomized trials. Socioaffect. Neurosci. Psychol. 2016, 6, 30054. [Google Scholar] [CrossRef] [PubMed]
  18. Campos, C.; Santos, S.; Gagen, E.; Machado, S.; Rocha, S.; Kurtz, M.M.; Rocha, N.B. Neuroplastic changes following social cognition training in schizophrenia: A systematic review. Neuropsychol. Rev. 2016, 26, 310–328. [Google Scholar] [CrossRef] [PubMed]
  19. Harrison, A.H.; Connolly, J.F. Finding a way in: A review and practical evaluation of fmri and EEG for detection and assessment in disorders of consciousness. Neurosci. Biobehav. Rev. 2013, 37, 1403–1419. [Google Scholar] [CrossRef] [PubMed]
  20. Acharya, U.R.; Sudarshan, V.K.; Adeli, H.; Santhosh, J.; Koh, J.E.W.; Adeli, A. Computer-aided diagnosis of depression using EEG signals. Eur. Neurol. 2015, 73, 329–336. [Google Scholar] [CrossRef] [PubMed]
  21. Bhat, S.; Acharya, U.R.; Adeli, H.; Bairy, G.M.; Adeli, A. Automated diagnosis of autism: In search of a mathematical marker. Rev. Neurosci. 2014, 25, 851–861. [Google Scholar] [CrossRef] [PubMed]
  22. Bontchev, B. Adaptation in affective video games: A literature review. Cybern. Inf. Technol. 2016, 16, 3–34. [Google Scholar] [CrossRef]
  23. Reyes-Munoz, A.; Domingo, M.C.; Lopez-Trinidad, M.A.; Delgado, J.L. Integration of body sensor networks and vehicular ad-hoc networks for traffic safety. Sensors 2016, 16. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  24. Gruzelier, J.H. EEG-neurofeedback for optimising performance. I: A review of cognitive and affective outcome in healthy participants. Neurosci. Biobehav. Rev. 2014, 44, 124–141. [Google Scholar] [CrossRef] [PubMed]
  25. Harmon-Jones, E.; Amodio, D.M.; Harmon-Jones, C. Action-based model of dissonance: A review, integration, and expansion of conceptions of cognitive conflict. Adv. Exp. Soc. Psychol. 2009, 41, 119–166. [Google Scholar]
  26. Kroupi, E.; Vesin, J.M.; Ebrahimi, T. Subject-independent odor pleasantness classification using brain and peripheral signals. IEEE Trans. Affect. Comput. 2016, 7, 422–434. [Google Scholar] [CrossRef]
  27. Keuper, K.; Zwitserlood, P.; Rehbein, M.A.; Eden, A.S.; Laeger, I.; Junghofer, M.; Zwanzger, P.; Dobel, C. Early prefrontal brain responses to the hedonic quality of emotional words—A simultaneous EEG and MEG study. PLoS ONE 2013, 8. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  28. Balconi, M.; Cobelli, C. Motivational mechanisms (BAS) and prefrontal cortical activation contribute to recognition memory for emotional words. rTMS effect on performance and EEG (alpha band) measures. Brain Lang. 2014, 137, 77–85. [Google Scholar] [CrossRef] [PubMed]
  29. Briesemeister, B.B.; Kuchinke, L.; Jacobs, A.M. Emotion word recognition: Discrete information effects first, continuous later? Brain Res. 2014, 1564, 62–71. [Google Scholar] [CrossRef] [PubMed]
  30. Kamp, S.M.; Potts, G.F.; Donchin, E. On the roles of distinctiveness and semantic expectancies in episodic encoding of emotional words. Psychophysiology 2015, 52, 1599–1609. [Google Scholar] [CrossRef] [PubMed]
  31. Mueller, C.J.; Kuchinke, L. Individual differences in emotion word processing: A diffusion model analysis. Cogn. Affect. Behav. Neurosci. 2016, 16, 489–501. [Google Scholar] [CrossRef] [PubMed]
  32. Imbir, K.K.; Spustek, T.; Zygierewicz, J. Effects of valence and origin of emotions in word processing evidenced by event related potential correlates in a lexical decision task. Front. Psychol. 2016, 7. [Google Scholar] [CrossRef] [PubMed]
  33. Novosel, A.; Lackner, N.; Unterrainer, H.F.; Dunitz-Scheer, M.; Scheer, P.J.Z.; Wallner-Liebmann, S.J.; Neuper, C. Motivational processing of food cues in anorexia nervosa: A pilot study. Eat. Weight Disord. Stud. Anorex. Bulim. Obes. 2014, 19, 169–175. [Google Scholar] [CrossRef] [PubMed]
  34. Chanel, G.; Rebetez, C.; Betrancourt, M.; Pun, T. Emotion assessment from physiological signals for adaptation of game difficulty. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 2011, 41, 1052–1063. [Google Scholar] [CrossRef]
  35. Tzieropoulos, H.; de Peralta, R.G.; Bossaerts, P.; Andino, S.L.G. The impact of disappointment in decision making: Inter-individual differences and electrical neuroimaging. Front. Hum. Neurosci. 2011, 4. [Google Scholar] [CrossRef] [PubMed]
  36. Spape, M.M.; Kivikangas, J.M.; Jarvela, S.; Kosunen, I.; Jacucci, G.; Ravaja, N. Keep your opponents close: Social context affects EEG and fEMG linkage in a turn-based computer game. PLoS ONE 2013, 8. [Google Scholar] [CrossRef] [PubMed]
  37. Mothes, H.; Enge, S.; Strobel, A. The interplay between feedback-related negativity and individual differences in altruistic punishment: An EEG study. Cogn. Affect. Behav. Neurosci. 2016, 16, 276–288. [Google Scholar] [CrossRef] [PubMed]
  38. Charland, P.; Leger, P.M.; Senecal, S.; Courtemanche, F.; Mercier, J.; Skelling, Y.; Labonte-Lemoyne, E. Assessing the multiple dimensions of engagement to characterize learning: A neurophysiological perspective. JOVE J. Vis. Exp. 2015. [Google Scholar] [CrossRef] [PubMed]
  39. Lopez-Gil, J.M.; Virgili-Goma, J.; Gil, R.; Garcia, R. Method for improving EEG based emotion recognition by combining it with synchronized biometric and eye tracking technologies in a non-invasive and low cost way. Front. Comput. Neurosci. 2016, 10. [Google Scholar] [CrossRef]
  40. Abdur-Rahim, J.; Morales, Y.; Gupta, P.; Umata, I.; Watanabe, A.; Even, J.; Suyama, T.; Ishii, S. Multi-sensor based state prediction for personal mobility vehicles. PLoS ONE 2016, 11. [Google Scholar] [CrossRef] [PubMed]
  41. Huang, X.H.; Kortelainen, J.; Zhao, G.Y.; Li, X.B.; Moilanen, A.; Seppanen, T.; Pietikainen, M. Multi-modal emotion analysis from facial expressions and electroencephalogram. Comput. Vis. Image Underst. 2016, 147, 114–124. [Google Scholar] [CrossRef]
  42. Wen, W.H.; Qiu, Y.H.; Liu, G.Y.; Cheng, N.P.; Huang, X.T. Construction and cross-correlation analysis of the affective physiological response database. Sci. China Inf. Sci. 2010, 53, 1774–1784. [Google Scholar] [CrossRef]
  43. Leventon, J.S.; Stevens, J.S.; Bauer, P.J. Development in the neurophysiology of emotion processing and memory in school-age children. Dev. Cogn. Neurosci. 2014, 10, 21–33. [Google Scholar] [CrossRef] [PubMed]
  44. Daly, I.; Williams, D.; Kirke, A.; Weaver, J.; Malik, A.; Hwang, F.; Miranda, E.; Nasuto, S.J. Affective brain-computer music interfacing. J. Neural Eng. 2016, 13. [Google Scholar] [CrossRef] [PubMed]
  45. Daly, I.; Chen, L.; Zhou, S.J.; Jin, J. An investigation into the use of six facially encoded emotions in brain-computer interfacing. Brain Comput. Interfaces 2016, 3, 59–73. [Google Scholar] [CrossRef]
  46. Knyazev, G.G.; Slobodskoj-Plusnin, J.Y.; Bocharov, A.V. Gender differences in implicit and explicit processing of emotional facial expressions as revealed by event-related theta synchronization. Emotion 2010, 10, 678–687. [Google Scholar] [CrossRef] [PubMed]
  47. Wieser, M.J.; Muhlberger, A.; Alpers, G.W.; Macht, M.; Ellgring, H.; Pauli, P. Emotion processing in Parkinson’s disease: Dissociation between early neuronal processing and explicit ratings. Clin. Neurophysiol. 2006, 117, 94–102. [Google Scholar] [CrossRef] [PubMed]
  48. Schaefer, A.; Pottage, C.L.; Rickart, A.J. Electrophysiological correlates of remembering emotional pictures. Neuroimage 2011, 54, 714–724. [Google Scholar] [CrossRef] [PubMed]
  49. Zheng, W.L.; Lu, B.L. Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks. IEEE Trans. Auton. Ment. Dev. 2015, 7, 162–175. [Google Scholar] [CrossRef]
  50. Peng, Y.; Zheng, W.L.; Lu, B.L. An unsupervised discriminative extreme learning machine and its applications to data clustering. Neurocomputing 2016, 174, 250–264. [Google Scholar] [CrossRef]
  51. Schaefer, A.; Fletcher, K.; Pottage, C.L.; Alexander, K.; Brown, C. The effects of emotional intensity on ERP correlates of recognition memory. Neuroreport 2009, 20, 319–324. [Google Scholar] [CrossRef] [PubMed]
  52. Lu, X.J.; Ho, H.T.; Liu, F.; Wu, D.X.; Thompson, W.F. Intonation processing deficits of emotional words among Mandarin Chinese speakers with congenital amusia: An ERP study. Front. Psychol. 2015, 6. [Google Scholar] [CrossRef] [PubMed]
  53. Lin, Y.P.; Yang, Y.H.; Jung, T.P. Fusion of electroencephalographic dynamics and musical contents for estimating emotional responses in music listening. Front. Neurosci. 2014, 8. [Google Scholar] [CrossRef] [PubMed]
  54. Turetsky, B.I.; Kohler, C.G.; Indersmitten, T.; Bhati, M.T.; Charbonnier, D.; Gur, R.C. Facial emotion recognition in schizophrenia: When and why does it go awry? Schizophr. Res. 2007, 94, 253–263. [Google Scholar] [CrossRef] [PubMed]
  55. Chen, X.H.; Yang, J.F.; Gan, S.Z.; Yang, Y.F. The contribution of sound intensity in vocal emotion perception: Behavioral and electrophysiological evidence. PLoS ONE 2012, 7. [Google Scholar] [CrossRef] [PubMed]
  56. Wang, L.; Bastiaansen, M. Oscillatory brain dynamics associated with the automatic processing of emotion in words. Brain Lang. 2014, 137, 120–129. [Google Scholar] [CrossRef] [PubMed]
  57. Wang, X.W.; Nie, D.; Lu, B.L. Emotional state classification from EEG data using machine learning approach. Neurocomputing 2014, 129, 94–106. [Google Scholar] [CrossRef]
  58. Lin, Y.P.; Duann, J.R.; Feng, W.F.; Chen, J.H.; Jung, T.P. Revealing spatio-spectral electroencephalographic dynamics of musical mode and tempo perception by independent component analysis. J. Neuroeng. Rehabil. 2014, 11. [Google Scholar] [CrossRef] [PubMed]
  59. Zhang, W.H.; Li, X.Y.; Liu, X.; Duan, X.X.; Wang, D.H.; Shen, J.L. Distraction reduces theta synchronization in emotion regulation during adolescence. Neurosci. Lett. 2013, 550, 81–86. [Google Scholar] [CrossRef] [PubMed]
  60. Calvo, M.G.; Beltran, D. Recognition advantage of happy faces: Tracing the neurocognitive processes. Neuropsychologia 2013, 51, 2051–2060. [Google Scholar] [CrossRef] [PubMed]
  61. Liu, Y.H.; Wu, C.T.; Cheng, W.T.; Hsiao, Y.T.; Chen, P.M.; Teng, J.T. Emotion recognition from single-trial EEG based on kernel Fisher’s emotion pattern and imbalanced quasiconformal kernel support vector machine. Sensors 2014, 14, 13361–13388. [Google Scholar] [CrossRef] [PubMed]
  62. Brennan, A.M.; Harris, A.W.F.; Williams, L.M. Neural processing of facial expressions of emotion in first onset psychosis. Psychiatry Res. 2014, 219, 477–485. [Google Scholar] [CrossRef] [PubMed]
  63. Liu, S.; Zhang, D.; Xu, M.P.; Qi, H.Z.; He, F.; Zhao, X.; Zhou, P.; Zhang, L.X.; Ming, D. Randomly dividing homologous samples leads to overinflated accuracies for emotion recognition. Int. J. Psychophysiol. 2015, 96, 29–37. [Google Scholar] [CrossRef] [PubMed]
  64. Perez-Edgar, K.; Kujawa, A.; Nelson, S.K.; Cole, C.; Zapp, D.J. The relation between electroencephalogram asymmetry and attention biases to threat at baseline and under stress. Brain Cogn. 2013, 82, 337–343. [Google Scholar] [CrossRef] [PubMed]
  65. Liu, T.R.; Xiao, T.; Shi, J.N. Automatic change detection to facial expressions in adolescents: Evidence from visual mismatch negativity responses. Front. Psychol. 2016, 7. [Google Scholar] [CrossRef] [PubMed]
  66. Kim, D.W.; Kim, H.S.; Lee, S.H.; Im, C.H. Positive and negative symptom scores are correlated with activation in different brain regions during facial emotion perception in schizophrenia patients: A voxel-based sLORETA source activity study. Schizophr. Res. 2013, 151, 165–174. [Google Scholar] [CrossRef] [PubMed]
  67. Lin, H.Y.; Xiang, J.; Li, S.L.; Liang, J.F.; Jin, H. Anticipation of negative pictures enhances the p2 and p3 in their later recognition. Front. Hum. Neurosci. 2015, 9. [Google Scholar] [CrossRef] [PubMed]
  68. Zhang, D.D.; Wang, L.L.; Luo, Y.; Luo, Y.J. Individual differences in detecting rapidly presented fearful faces. PLoS ONE 2012, 7. [Google Scholar] [CrossRef] [PubMed]
  69. Yu, B.; Ma, L.; Li, H.F.; Zhao, L.; Bo, H.J.; Wang, X.D. Biological computation indexes of brain oscillations in unattended facial expression processing based on event-related synchronization/desynchronization. Comput. Math. Methods Med. 2016. [Google Scholar] [CrossRef] [PubMed]
  70. Zhang, W.H.; Lu, J.M.; Liu, X.; Fang, H.L.; Li, H.; Wang, D.H.; Shen, J.L. Event-related synchronization of delta and beta oscillations reflects developmental changes in the processing of affective pictures during adolescence. Int. J. Psychophysiol. 2013, 90, 334–340. [Google Scholar] [CrossRef] [PubMed]
  71. Wang, S.F.; Zhu, Y.C.; Wu, G.B.; Ji, Q. Hybrid video emotional tagging using users' EEG and video content. Multimed. Tools Appl. 2014, 72, 1257–1283. [Google Scholar] [CrossRef]
  72. Williams, L.M.; Whitford, T.J.; Nagy, M.; Flynn, G.; Harris, A.W.F.; Silverstein, S.M.; Gordon, E. Emotion-elicited gamma synchrony in patients with first-episode schizophrenia: A neural correlate of social cognition outcomes. J. Psychiatry Neurosci. 2009, 34, 303–313. [Google Scholar] [PubMed]
  73. Andrews, S.C.; Enticott, P.G.; Hoy, K.E.; Thomson, R.H.; Fitzgerald, P.B. No evidence for mirror system dysfunction in schizophrenia from a multimodal TMS/EEG study. Psychiatry Res. 2015, 228, 431–440. [Google Scholar] [CrossRef] [PubMed]
  74. Shen, X.B.; Wu, Q.; Zhao, K.; Fu, X.L. Electrophysiological evidence reveals differences between the recognition of microexpressions and macroexpressions. Front. Psychol. 2016, 7. [Google Scholar] [CrossRef] [PubMed]
  75. Kylliainen, A.; Wallace, S.; Coutanche, M.N.; Leppanen, J.M.; Cusack, J.; Bailey, A.J.; Hietanen, J.K. Affective-motivational brain responses to direct gaze in children with autism spectrum disorder. J. Child Psychol. Psychiatry 2012, 53, 790–797. [Google Scholar] [CrossRef] [PubMed]
  76. Croft, R.J.; McKernan, F.; Gray, M.; Churchyard, A.; Georgiou-Karistianis, N. Emotion perception and electrophysiological correlates in Huntington’s disease. Clin. Neurophysiol. 2014, 125, 1618–1625. [Google Scholar] [CrossRef] [PubMed]
  77. Beltran, D.; Calvo, M.G. Brain signatures of perceiving a smile: Time course and source localization. Hum. Brain Mapp. 2015, 36, 4287–4303. [Google Scholar] [CrossRef] [PubMed]
  78. Lin, H.Y.; Schulz, C.; Straube, T. Cognitive tasks during expectation affect the congruency ERP effects to facial expressions. Front. Hum. Neurosci. 2015, 9. [Google Scholar] [CrossRef] [PubMed]
  79. Hilimire, M.R.; Mayberg, H.S.; Holtzheimer, P.E.; Broadway, J.M.; Parks, N.A.; DeVylder, J.E.; Corballis, P.M. Effects of subcallosal cingulate deep brain stimulation on negative self-bias in patients with treatment-resistant depression. Brain Stimul. 2015, 8, 185–191. [Google Scholar] [CrossRef] [PubMed]
  80. Makin, A.D.J.; Wilton, M.M.; Pecchinenda, A.; Bertamini, M. Symmetry perception and affective responses: A combined EEG/EMG study. Neuropsychologia 2012, 50, 3250–3261. [Google Scholar] [CrossRef] [PubMed]
  81. Magnée, M.J.; de Gelder, B.; van Engeland, H.; Kemner, C. Atypical processing of fearful face-voice pairs in pervasive developmental disorder: An ERP study. Clin. Neurophysiol. 2008, 119, 2004–2010. [Google Scholar] [CrossRef] [PubMed]
  82. Matsuda, I.; Nittono, H.; Allen, J.J.B. Detection of concealed information by P3 and frontal EEG asymmetry. Neurosci. Lett. 2013, 537, 55–59. [Google Scholar] [CrossRef] [PubMed]
  83. Lin, H.Y.; Schulz, C.; Straube, T. Fearful contextual expression impairs the encoding and recognition of target faces: An ERP study. Front. Behav. Neurosci. 2015, 9. [Google Scholar] [CrossRef] [PubMed]
  84. Dennis, T.A.; Hajcak, G. The late positive potential: A neurophysiological marker for emotion regulation in children. J. Child Psychol. Psychiatry 2009, 50, 1373–1383. [Google Scholar] [CrossRef] [PubMed]
  85. Codispoti, M.; De Cesarei, A.; Ferrari, V. The influence of color on emotional perception of natural scenes. Psychophysiology 2012, 49, 11–16. [Google Scholar] [CrossRef] [PubMed]
  86. Gallant, S.N.; Dyson, B.J. Neural modulation of directed forgetting by valence and arousal: An event-related potential study. Brain Res. 2016, 1648, 306–316. [Google Scholar] [CrossRef] [PubMed]
  87. Newsome, R.N.; Dulas, M.R.; Duarte, A. The effects of aging on emotion-induced modulations of source retrieval ERPS: Evidence for valence biases. Neuropsychologia 2012, 50, 3370–3384. [Google Scholar] [CrossRef] [PubMed]
  88. Soleymani, M.; Pantic, M.; Pun, T. Multimodal emotion recognition in response to videos. IEEE Trans. Affect. Comput. 2012, 3, 211–223. [Google Scholar] [CrossRef]
  89. Lindstrom, R.; Lepisto, T.; Makkonen, T.; Kujala, T. Processing of prosodic changes in natural speech stimuli in school-age children. Int. J. Psychophysiol. 2012, 86, 229–237. [Google Scholar] [CrossRef] [PubMed]
  90. Komlosi, S.; Csukly, G.; Stefanics, G.; Czigler, I.; Bitter, I.; Czobor, P. Fearful face recognition in schizophrenia: An electrophysiological study. Schizophr. Res. 2013, 149, 135–140. [Google Scholar] [CrossRef] [PubMed]
  91. Achaibou, A.; Pourtois, G.; Schwartz, S.; Vuilleumier, P. Simultaneous recording of EEG and facial muscle reactions during spontaneous emotional mimicry. Neuropsychologia 2008, 46, 1104–1113. [Google Scholar] [CrossRef] [PubMed]
  92. Koelstra, S.; Muhl, C.; Soleymani, M.; Lee, J.S.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A.; Patras, I. DEAP: A database for emotion analysis using physiological signals. IEEE Trans. Affect. Comput. 2012, 3, 18–31. [Google Scholar] [CrossRef]
  93. Ponz, A.; Montant, M.; Liegeois-Chauvel, C.; Silva, C.; Braun, M.; Jacobs, A.M.; Ziegler, J.C. Emotion processing in words: A test of the neural re-use hypothesis using surface and intracranial EEG. Soc. Cogn. Affect. Neurosci. 2014, 9, 619–627. [Google Scholar] [CrossRef] [PubMed]
  94. Csukly, G.; Stefanics, G.; Komlosi, S.; Czigler, I.; Czobor, P. Emotion-related visual mismatch responses in schizophrenia: Impairments and correlations with emotion recognition. PLoS ONE 2013, 8. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  95. Bhushan, V.; Saha, G.; Lindsen, J.; Shimojo, S.; Bhattacharya, J. How we choose one over another: Predicting trial-by-trial preference decision. PLoS ONE 2012, 7. [Google Scholar] [CrossRef] [PubMed]
  96. Kryuchkova, T.; Tucker, B.V.; Wurm, L.H.; Baayen, R.H. Danger and usefulness are detected early in auditory lexical processing: Evidence from electroencephalography. Brain Lang. 2012, 122, 81–91. [Google Scholar] [CrossRef] [PubMed]
  97. Schirmer, A.; Escoffier, N.; Li, Q.Y.; Li, H.; Strafford-Wilson, J.; Lie, W.I. What grabs his attention but not hers? Estrogen correlates with neurophysiological measures of vocal change detection. Psychoneuroendocrinology 2008, 33, 718–727. [Google Scholar] [CrossRef] [PubMed]
  98. Csukly, G.; Farkas, K.; Marosi, C.; Szabo, A. Deficits in low beta desynchronization reflect impaired emotional processing in schizophrenia. Schizophr. Res. 2016, 171, 207–214. [Google Scholar] [CrossRef] [PubMed]
  99. Hagemann, J.; Straube, T.; Schulz, C. Too bad: Bias for angry faces in social anxiety interferes with identity processing. Neuropsychologia 2016, 84, 136–149. [Google Scholar] [CrossRef] [PubMed]
  100. Liu, Y.; Wang, C.G.; Wang, X.H.; Zhou, P.Y.; Yu, G.N.; Chan, K.C.C. What strikes the strings of your heart? Multi-label dimensionality reduction for music emotion analysis via brain imaging. IEEE Trans. Auton. Ment. Dev. 2015, 7, 176–188. [Google Scholar]
  101. Soleymani, M.; Lichtenauer, J.; Pun, T.; Pantic, M. A multimodal database for affect recognition and implicit tagging. IEEE Trans. Affect. Comput. 2012, 3, 42–55. [Google Scholar] [CrossRef]
  102. Bercik, J.; Horska, E.; Wang, R.W.Y.; Chen, Y.C. The impact of parameters of store illumination on food shopper response. Appetite 2016, 106, 101–109. [Google Scholar] [CrossRef] [PubMed]
  103. Martinez, F.; Barraza, C.; Gonzalez, N.; Gonzalez, J. Kapean: Understanding affective states of children with ADHD. Educ. Technol. Soc. 2016, 19, 18–28. [Google Scholar]
  104. Mehmood, R.M.; Lee, H.J. A novel feature extraction method based on late positive potential for emotion recognition in human brain signal patterns. Comput. Electr. Eng. 2016, 53, 444–457. [Google Scholar] [CrossRef]
  105. Meza-Kubo, V.; Moran, A.L.; Carrillo, I.; Galindo, G.; Garcia-Canseco, E. Assessing the user experience of older adults using a neural network trained to recognize emotions from brain signals. J. Biomed. Inform. 2016, 62, 202–209. [Google Scholar] [CrossRef] [PubMed]
  106. Yuvaraj, R.; Murugappan, M.; Acharya, U.R.; Adeli, H.; Ibrahim, N.M.; Mesquita, E. Brain functional connectivity patterns for emotional state classification in Parkinson’s disease patients without dementia. Behav. Brain Res. 2016, 298, 248–260. [Google Scholar] [CrossRef] [PubMed]
  107. Yuvaraj, R.; Murugappan, M. Hemispheric asymmetry non-linear analysis of EEG during emotional responses from idiopathic Parkinson’s disease patients. Cogn. Neurodyn. 2016, 10, 225–234. [Google Scholar] [CrossRef] [PubMed]
  108. Jatupaiboon, N.; Pan-ngum, S.; Israsena, P. Real-time EEG-based happiness detection system. Sci. World J. 2013. [Google Scholar] [CrossRef] [PubMed]
  109. Gil, R.; Virgili-Goma, J.; Garcia, R.; Mason, C. Emotions ontology for collaborative modelling and learning of emotional responses. Comput. Hum. Behav. 2015, 51, 610–617. [Google Scholar] [CrossRef]
  110. Hadjidimitriou, S.K.; Hadjileontiadis, L.J. EEG-based classification of music appraisal responses using time-frequency analysis and familiarity ratings. IEEE Trans. Affect. Comput. 2013, 4, 161–172. [Google Scholar] [CrossRef]
  111. Kuber, R.; Wright, F.P. Augmenting the instant messaging experience through the use of brain-computer interface and gestural technologies. Int. J. Hum. Comput. Interact. 2013, 29, 178–191. [Google Scholar] [CrossRef]
  112. Hadjidimitriou, S.K.; Hadjileontiadis, L.J. Toward an EEG-based recognition of music liking using time-frequency analysis. IEEE Trans. Biomed. Eng. 2012, 59, 3498–3510. [Google Scholar] [CrossRef] [PubMed]
  113. Choi, J.S.; Bang, J.W.; Heo, H.; Park, K.R. Evaluation of fear using nonintrusive measurement of multimodal sensors. Sensors 2015, 15, 17507–17533. [Google Scholar] [CrossRef] [PubMed]
  114. Yuvaraj, R.; Murugappan, M.; Omar, M.I.; Ibrahim, N.M.; Sundaraj, K.; Mohamad, K.; Satiyan, M. Emotion processing in Parkinson’s disease: An EEG spectral power study. Int. J. Neurosci. 2014, 124, 491–502. [Google Scholar] [CrossRef] [PubMed]
  115. Sourina, O.; Liu, Y.S.; Nguyen, M.K. Real-time EEG-based emotion recognition for music therapy. J. Multimodal User Interfaces 2012, 5, 27–35. [Google Scholar] [CrossRef]
  116. Shahabi, H.; Moghimi, S. Toward automatic detection of brain responses to emotional music through analysis of EEG effective connectivity. Comput. Hum. Behav. 2016, 58, 231–239. [Google Scholar] [CrossRef]
  117. Lan, Z.R.; Sourina, O.; Wang, L.P.; Liu, Y.S. Real-time EEG-based emotion monitoring using stable features. Vis. Comput. 2016, 32, 347–358. [Google Scholar] [CrossRef]
  118. Yuvaraj, R.; Murugappan, M.; Ibrahim, N.M.; Sundaraj, K.; Omar, M.I.; Mohamad, K.; Palaniappan, R. Detection of emotions in Parkinson’s disease using higher order spectral features from brain’s electrical activity. Biomed. Signal Process. Control 2014, 14, 108–116. [Google Scholar] [CrossRef]
  119. Yuvaraj, R.; Murugappan, M.; Ibrahim, N.M.; Omar, M.I.; Sundaraj, K.; Mohamad, K.; Palaniappan, R.; Satiyan, M. Emotion classification in Parkinson’s disease by higher-order spectra and power spectrum features using EEG signals: A comparative study. J. Integr. Neurosci. 2014, 13, 89–120. [Google Scholar] [CrossRef] [PubMed]
  120. Yuvaraj, R.; Murugappan, M.; Ibrahim, N.M.; Sundaraj, K.; Omar, M.I.; Mohamad, K.; Palaniappan, R.; Satiyan, M. Inter-hemispheric EEG coherence analysis in Parkinson’s disease: Assessing brain activity during emotion processing. J. Neural Transm. 2015, 122, 237–252. [Google Scholar] [CrossRef] [PubMed]
  121. Yuvaraj, R.; Murugappan, M.; Ibrahim, N.M.; Sundaraj, K.; Omar, M.I.; Mohamad, K.; Palaniappan, R. Optimal set of EEG features for emotional state classification and trajectory visualization in Parkinson’s disease. Int. J. Psychophysiol. 2014, 94, 482–495. [Google Scholar] [CrossRef] [PubMed]
  122. Yuvaraj, R.; Murugappan, M.; Ibrahim, N.M.; Omar, M.I.; Sundaraj, K.; Mohamad, K.; Palaniappan, R.; Mesquita, E.; Satiyan, M. On the analysis of EEG power, frequency and asymmetry in Parkinson’s disease during emotion processing. Behav. Brain Funct. 2014, 10. [Google Scholar] [CrossRef] [PubMed]
  123. Yang, T.; Lee, D.Y.; Kwak, Y.; Choi, J.; Kim, C.; Kim, S.P. Evaluation of tv commercials using neurophysiological responses. J. Physiol. Anthropol. 2015, 34. [Google Scholar] [CrossRef] [PubMed]
  124. Sokhadze, E.M.; Tasman, A.; Tamas, R.; El-Mallakh, R.S. Event-related potential study of the effects of emotional facial expressions on task performance in euthymic bipolar patients. Appl. Psychophysiol. Biofeedback 2011, 36, 1–13. [Google Scholar] [CrossRef] [PubMed]
  125. Trentini, C.; Pagani, M.; Fania, P.; Speranza, A.M.; Nicolais, G.; Sibilia, A.; Inguscio, L.; Verardo, A.R.; Fernandez, I.; Ammaniti, M. Neural processing of emotions in traumatized children treated with eye movement desensitization and reprocessing therapy: A hdEEG study. Front. Psychol. 2015, 6. [Google Scholar] [CrossRef] [PubMed]
  126. O’Connor, K.; Hamm, J.P.; Kirk, I.J. The neurophysiological correlates of face processing in adults and children with Asperger’s syndrome. Brain Cogn. 2005, 59, 82–95. [Google Scholar] [CrossRef] [PubMed]
  127. Sabbagh, M.A.; Flynn, J. Mid-frontal EEG alpha asymmetries predict individual differences in one aspect of theory of mind: Mental state decoding. Soc. Neurosci. 2006, 1, 299–308. [Google Scholar] [CrossRef] [PubMed]
  128. Dai, J.Q.; Zhai, H.C.; Wu, H.Y.; Yang, S.Y.; Cacioppo, J.T.; Cacioppo, S.; Luo, Y.J. Maternal face processing in Mosuo preschool children. Biol. Psychol. 2014, 99, 69–76. [Google Scholar] [CrossRef] [PubMed]
  129. Todd, R.M.; Lewis, M.D.; Meusel, L.A.; Zelazo, P.D. The time course of social-emotional processing in early childhood: ERP responses to facial affect and personal familiarity in a go-nogo task. Neuropsychologia 2008, 46, 595–613. [Google Scholar] [CrossRef] [PubMed]
  130. Lahat, A.; Todd, R.M.; Mahy, C.E.V.; Lau, K.; Zelazo, P.D. Neurophysiological correlates of executive function: A comparison of European-Canadian and Chinese-Canadian 5-year-old children. Front. Hum. Neurosci. 2010, 3. [Google Scholar] [CrossRef] [PubMed]
  131. Poolman, P.; Frank, R.M.; Luu, P.; Pederson, S.M.; Tucker, D.M. A single-trial analytic framework for EEG analysis and its application to target detection and classification. Neuroimage 2008, 42, 787–798. [Google Scholar] [CrossRef] [PubMed]
  132. Mai, X.Q.; Xu, L.; Li, M.Y.; Shao, J.; Zhao, Z.Y.; Lamm, C.; Fox, N.A.; Nelson, C.A.; Lozoff, B. Sounds elicit relative left frontal alpha activity in 2-month-old infants. Int. J. Psychophysiol. 2014, 94, 287–291. [Google Scholar] [CrossRef] [PubMed]
  133. Noll, L.K.; Mayes, L.C.; Rutherford, H.J.V. Investigating the impact of parental status and depression symptoms on the early perceptual coding of infant faces: An event-related potential study. Soc. Neurosci. 2012, 7, 525–536. [Google Scholar] [CrossRef] [PubMed]
  134. Bornstein, M.H.; Arterberry, M.E.; Mash, C. Differentiated brain activity in response to faces of “own” versus “unfamiliar” babies in primipara mothers: An electrophysiological study. Dev. Neuropsychol. 2013, 38, 365–385. [Google Scholar] [CrossRef] [PubMed]
  135. Auerbach, R.P.; Stewart, J.G.; Stanton, C.H.; Mueller, E.M.; Pizzagalli, D.A. Emotion-processing biases and resting EEG activity in depressed adolescents. Depress. Anxiety 2015, 32, 693–701. [Google Scholar] [CrossRef] [PubMed]
  136. Kashihara, K. A brain-computer interface for potential non-verbal facial communication based on EEG signals related to specific emotions. Front. Neurosci. 2014, 8. [Google Scholar] [CrossRef] [PubMed]
  137. Andino, S.L.G.; Menendez, R.G.D.; Khateb, A.; Landis, T.; Pegna, A.J. Electrophysiological correlates of affective blindsight. Neuroimage 2009, 44, 581–589. [Google Scholar] [PubMed]
  138. Wieser, M.J.; Keil, A. Fearful faces heighten the cortical representation of contextual threat. Neuroimage 2014, 86, 317–325. [Google Scholar] [CrossRef] [PubMed]
  139. Pincham, H.L.; Bryce, D.; Fearon, R.M.P. The neural correlates of emotion processing in juvenile offenders. Dev. Sci. 2015, 18, 994–1005. [Google Scholar] [CrossRef] [PubMed]
  140. Apicella, F.; Sicca, F.; Federico, R.R.; Campatelli, G.; Muratori, F. Fusiform gyrus responses to neutral and emotional faces in children with autism spectrum disorders: A high density ERP study. Behav. Brain Res. 2013, 251, 155–162. [Google Scholar] [CrossRef] [PubMed]
  141. Deweese, M.M.; Bradley, M.M.; Lang, P.J.; Andersen, S.K.; Muller, M.M.; Keil, A. Snake fearfulness is associated with sustained competitive biases to visual snake features: Hypervigilance without avoidance. Psychiatry Res. 2014, 219, 329–335. [Google Scholar] [CrossRef] [PubMed]
  142. Rochas, V.; Rihs, T.A.; Rosenberg, N.; Landis, T.; Michel, C.M. Very early processing of emotional words revealed in temporoparietal junctions of both hemispheres by EEG and TMS. Exp. Brain Res. 2014, 232, 1267–1281. [Google Scholar] [CrossRef] [PubMed]
  143. Akano, A.J.; Haley, D.W.; Dudek, J. Investigating social cognition in infants and adults using dense array electroencephalography ((d)EEG). JOVE J. Vis. Exp. 2011. [Google Scholar] [CrossRef] [PubMed]
  144. Stothart, G.; Maynard, O.; Lavis, R.; Munafo, M. Neural correlates of cigarette health warning avoidance among smokers. Drug Alcohol Depend. 2016, 161, 155–162. [Google Scholar] [CrossRef] [PubMed]
  145. Degabriele, R.; Lagopoulos, J.; Malhi, G. Neural correlates of emotional face processing in bipolar disorder: An event-related potential study. J. Affect. Disord. 2011, 133, 212–220. [Google Scholar] [CrossRef] [PubMed]
  146. Reicherts, P.; Wieser, M.J.; Gerdes, A.B.M.; Likowski, K.U.; Weyers, P.; Muhlberger, A.; Pauli, P. Electrocortical evidence for preferential processing of dynamic pain expressions compared to other emotional expressions. Pain 2012, 153, 1959–1964. [Google Scholar] [CrossRef] [PubMed]
  147. Chen, X.H.; Han, L.Z.; Pan, Z.H.; Luo, Y.M.; Wang, P. Influence of attention on bimodal integration during emotional change decoding: ERP evidence. Int. J. Psychophysiol. 2016, 106, 14–20. [Google Scholar] [CrossRef] [PubMed]
  148. Chen, X.H.; Pan, Z.H.; Wang, P.; Yang, X.H.; Liu, P.; You, X.Q.; Yuan, J.J. The integration of facial and vocal cues during emotional change perception: EEG markers. Soc. Cogn. Affect. Neurosci. 2016, 11, 1152–1161. [Google Scholar] [CrossRef] [PubMed]
  149. Balconi, M.; Grippa, E.; Vanutelli, M.E. What hemodynamic (fnirs), electrophysiological (EEG) and autonomic integrated measures can tell us about emotional processing. Brain Cogn. 2015, 95, 67–76. [Google Scholar] [CrossRef] [PubMed]
  150. Ullrich, S.; Kotz, S.A.; Schmidtke, D.S.; Aryani, A.; Conrad, M. Phonological iconicity electrifies: An ERP study on affective sound-to-meaning correspondences in German. Front. Psychol. 2016, 7. [Google Scholar] [CrossRef] [PubMed]
  151. Zhang, L.; Peng, W.W.; Chen, J.; Hu, L. Electrophysiological evidences demonstrating differences in brain functions between nonmusicians and musicians. Sci. Rep. 2015, 5. [Google Scholar] [CrossRef] [PubMed]
  152. Kanske, P.; Schonfelder, S.; Wessa, M. Emotional modulation of the attentional blink and the relation to interpersonal reactivity. Front. Hum. Neurosci. 2013, 7. [Google Scholar] [CrossRef] [PubMed]
  153. del Giudice, R.; Blume, C.; Wislowska, M.; Lechinger, J.; Heib, D.P.J.; Pichler, G.; Donis, J.; Michitsch, G.; Gnjezda, M.T.; Chinchilla, M.; et al. Can self-relevant stimuli help assessing patients with disorders of consciousness? Conscious. Cogn. 2016, 44, 51–60. [Google Scholar] [CrossRef] [PubMed]
  154. Daly, I.; Malik, A.; Hwang, F.; Roesch, E.; Weaver, J.; Kirke, A.; Williams, D.; Miranda, E.; Nasuto, S.J. Neural correlates of emotional responses to music: An EEG study. Neurosci. Lett. 2014, 573, 52–57. [Google Scholar] [CrossRef] [PubMed]
  155. Leyh, R.; Heinisch, C.; Kungl, M.T.; Spangler, G. Attachment representation moderates the influence of emotional context on information processing. Front. Hum. Neurosci. 2016, 10. [Google Scholar] [CrossRef] [PubMed]
  156. Agrawal, D.; Thorne, J.D.; Viola, F.C.; Timm, L.; Debener, S.; Buchner, A.; Dengler, R.; Wittfoth, M. Electrophysiological responses to emotional prosody perception in cochlear implant users. Neuroimage Clin. 2013, 2, 229–238. [Google Scholar] [CrossRef] [PubMed]
  157. Hettich, D.T.; Bolinger, E.; Matuz, T.; Birbaumer, N.; Rosenstiel, W.; Spuler, M. EEG responses to auditory stimuli for automatic affect recognition. Front. Neurosci. 2016, 10. [Google Scholar] [CrossRef] [PubMed]
  158. Papousek, I.; Schulter, G.; Weiss, E.M.; Samson, A.C.; Freudenthaler, H.H.; Lackner, H.K. Frontal brain asymmetry and transient cardiovascular responses to the perception of humor. Biol. Psychol. 2013, 93, 114–121. [Google Scholar] [CrossRef] [PubMed]
  159. Reva, N.V.; Pavlov, S.V.; Loktev, K.V.; Korenyok, V.V.; Aftanas, L.I. Influence of long-term Sahaja Yoga meditation practice on emotional processing in the brain: An ERP study. Neuroscience 2014, 281, 195–201. [Google Scholar] [CrossRef] [PubMed]
  160. Conrad, M.; Recio, G.; Jacobs, A.M. The time course of emotion effects in first and second language processing: A cross-cultural ERP study with German-Spanish bilinguals. Front. Psychol. 2011, 2. [Google Scholar] [CrossRef] [PubMed]
  161. Balconi, M.; Vanutelli, M.E. Vocal and visual stimulation, congruence and lateralization affect brain oscillations in interspecies emotional positive and negative interactions. Soc. Neurosci. 2016, 11, 297–310. [Google Scholar] [CrossRef] [PubMed]
  162. Liu, P.; Rigoulot, S.; Pell, M.D. Cultural differences in on-line sensitivity to emotional voices: Comparing East and West. Front. Hum. Neurosci. 2015, 9. [Google Scholar] [CrossRef] [PubMed]
  163. Gartner, M.; Bajbouj, M. Encoding-related EEG oscillations during memory formation are modulated by mood state. Soc. Cogn. Affect. Neurosci. 2014, 9, 1934–1941. [Google Scholar] [CrossRef] [PubMed]
  164. Fraedrich, E.M.; Lakatos, K.; Spangler, G. Brain activity during emotion perception: The role of attachment representation. Attach. Hum. Dev. 2010, 12, 231–248. [Google Scholar] [CrossRef] [PubMed]
  165. Kochel, A.; Leutgeb, V.; Schienle, A. Affective inhibitory control in adults with attention deficit hyperactivity disorder: Abnormalities in electrocortical late positivity. Neurosci. Lett. 2012, 530, 47–52. [Google Scholar] [CrossRef] [PubMed]
  166. Groch, S.; Wilhelm, I.; Diekelmann, S.; Born, J. The role of rem sleep in the processing of emotional memories: Evidence from behavior and event-related potentials. Neurobiol. Learn. Memory 2013, 99, 1–9. [Google Scholar] [CrossRef] [PubMed]
  167. Ruchsow, M.; Groen, G.; Kiefer, M.; Buchheim, A.; Walter, H.; Martius, P.; Reiter, M.; Hermle, L.; Spitzer, M.; Ebert, D.; et al. Response inhibition in borderline personality disorder: Event-related potentials in a go/nogo task. J. Neural Transm. 2008, 115, 127–133. [Google Scholar] [CrossRef] [PubMed]
  168. Kuhnpast, N.; Gramann, K.; Pollatos, O. Electrophysiologic evidence for multilevel deficits in emotional face processing in patients with bulimia nervosa. Psychosom. Med. 2012, 74, 736–744. [Google Scholar] [CrossRef] [PubMed]
  169. Missana, M.; Grossmann, T. Infants’ emerging sensitivity to emotional body expressions: Insights from asymmetrical frontal brain activity. Dev. Psychol. 2015, 51, 151–160. [Google Scholar] [CrossRef] [PubMed]
  170. Pollatos, O.; Kirsch, W.; Schandry, R. On the relationship between interoceptive awareness, emotional experience, and brain processes. Cogn. Brain Res. 2005, 25, 948–962. [Google Scholar] [CrossRef] [PubMed]
  171. Herbert, B.M.; Pollatos, O.; Schandry, R. Interoceptive sensitivity and emotion processing: An EEG study. Int. J. Psychophysiol. 2007, 65, 214–227. [Google Scholar] [CrossRef] [PubMed]
  172. del Giudice, R.; Lechinger, J.; Wislowska, M.; Heib, D.P.J.; Hoedlmoser, K.; Schabus, M. Oscillatory brain responses to own names uttered by unfamiliar and familiar voices. Brain Res. 2014, 1591, 63–73. [Google Scholar] [CrossRef] [PubMed]
  173. Hidalgo-Munoz, A.R.; Lopez, M.M.; Santos, I.M.; Pereira, A.T.; Vazquez-Marrufo, M.; Galvao-Carmona, A.; Tome, A.M. Application of svm-rfe on EEG signals for detecting the most relevant scalp regions linked to affective valence processing. Expert Syst. Appl. 2013, 40, 2102–2108. [Google Scholar] [CrossRef]
  174. Hidalgo-Munoz, A.R.; Lopez, M.M.; Pereira, A.T.; Santos, I.M.; Tome, A.M. Spectral turbulence measuring as feature extraction method from EEG on affective computing. Biomed. Signal Process. Control 2013, 8, 945–950. [Google Scholar] [CrossRef]
  175. Utama, N.P.; Takemoto, A.; Koike, Y.; Nakamura, K. Phased processing of facial emotion: An ERP study. Neurosci. Res. 2009, 64, 30–40. [Google Scholar] [CrossRef] [PubMed]
  176. Estepp, J.R.; Christensen, J.C. Electrode replacement does not affect classification accuracy in dual-session use of a passive brain-computer interface for assessing cognitive workload. Front. Neurosci. 2015, 9. [Google Scholar] [CrossRef] [PubMed]
  177. Gasbarri, A.; Arnone, B.; Pompili, A.; Pacitti, F.; Pacitti, C.; Cahill, L. Sex-related hemispheric lateralization of electrical potentials evoked by arousing negative stimuli. Brain Res. 2007, 1138, 178–186. [Google Scholar] [CrossRef] [PubMed]
  178. Jessen, S.; Obleser, J.; Kotz, S.A. How bodies and voices interact in early emotion perception. PLoS ONE 2012, 7, e36070. [Google Scholar] [CrossRef] [PubMed]
  179. Jessen, S.; Kotz, S.A. The temporal dynamics of processing emotions from vocal, facial, and bodily expressions. Neuroimage 2011, 58, 665–674. [Google Scholar] [CrossRef] [PubMed]
  180. Liu, T.S.; Pinheiro, A.P.; Zhao, Z.X.; Nestor, P.G.; McCarley, R.W.; Niznikiewicz, M. Simultaneous face and voice processing in schizophrenia. Behav. Brain Res. 2016, 305, 76–86. [Google Scholar] [CrossRef] [PubMed]
  181. Heutink, J.; Brouwer, W.H.; de Jong, B.M.; Bouma, A. Conscious and unconscious processing of fear after right amygdala damage: A single case ERP-study. Neurocase 2011, 17, 297–312. [Google Scholar] [CrossRef] [PubMed]
  182. Carretie, L.; Hinojosa, J.A.; Albert, J.; Mercado, F. Neural response to sustained affective visual stimulation using an indirect task. Exp. Brain Res. 2006, 174, 630–637. [Google Scholar] [CrossRef] [PubMed]
  183. Conrad, N.J.; Schmidt, L.A.; Niccols, A.; Polak, C.P.; Riniolo, T.C.; Burack, J.A. Frontal electroencephalogram asymmetry during affective processing in children with down syndrome: A pilot study. J. Intell. Disabil. Res. 2007, 51, 988–995. [Google Scholar] [CrossRef] [PubMed]
  184. Santesso, D.L.; Reker, D.L.; Schmidt, L.A.; Segalowitz, S.J. Frontal electroencephalogram activation asymmetry, emotional intelligence, and externalizing behaviors in 10-year-old children. Child Psychiatry Hum. Dev. 2006, 36, 311–328. [Google Scholar] [CrossRef] [PubMed]
  185. Petrantonakis, P.C.; Hadjileontiadis, L.J. Adaptive emotional information retrieval from EEG signals in the time-frequency domain. IEEE Trans. Signal Process. 2012, 60, 2604–2616. [Google Scholar] [CrossRef]
  186. Zhang, C.; Tong, L.; Zeng, Y.; Jiang, J.F.; Bu, H.B.; Yan, B.; Li, J.X. Automatic artifact removal from electroencephalogram data based on a priori artifact information. Biomed. Res. Int. 2015. [Google Scholar] [CrossRef] [PubMed]
  187. Jin, J.; Allison, B.Z.; Kaufmann, T.; Kubler, A.; Zhang, Y.; Wang, X.Y.; Cichocki, A. The changing face of P300 BCIs: A comparison of stimulus changes in a P300 BCI involving faces, emotion, and movement. PLoS ONE 2012, 7. [Google Scholar] [CrossRef] [PubMed]
  188. Muhl, C.; Jeunet, C.; Lotte, F. EEG-based workload estimation across affective contexts. Front. Neurosci. 2014, 8. [Google Scholar] [CrossRef] [Green Version]
  189. Petrantonakis, P.C.; Hadjileontiadis, L.J. A novel emotion elicitation index using frontal brain asymmetry for enhanced EEG-based emotion recognition. IEEE Trans. Inf. Technol. Biomed. 2011, 15, 737–746. [Google Scholar] [CrossRef] [PubMed]
  190. Naji, M.; Firoozabadi, M.; Azadfallah, P. Emotion classification during music listening from forehead biosignals. Signal Image Video Process. 2015, 9, 1365–1375. [Google Scholar] [CrossRef]
  191. Yano, K.; Suyama, T. A novel fixed low-rank constrained EEG spatial filter estimation with application to movie-induced emotion recognition. Comput. Intell. Neurosci. 2016. [Google Scholar] [CrossRef] [PubMed]
  192. Zhou, F.; Qu, X.D.; Jiao, J.X.; Helander, M.G. Emotion prediction from physiological signals: A comparison study between visual and auditory elicitors. Interact. Comput. 2014, 26, 285–302. [Google Scholar] [CrossRef]
  193. Wang, S.F.; Zhu, Y.C.; Yue, L.H.; Ji, Q. Emotion recognition with the help of privileged information. IEEE Trans. Auton. Ment. Dev. 2015, 7, 189–200. [Google Scholar] [CrossRef]
  194. Zhang, X.W.; Hu, B.; Chen, J.; Moore, P. Ontology-based context modeling for emotion recognition in an intelligent web. World Wide Web Internet Web Inf. Syst. 2013, 16, 497–513. [Google Scholar] [CrossRef]
  195. Li, C.; Feng, Z.Y.; Xu, C. Error-correcting output codes for multi-label emotion classification. Multimed. Tools Appl. 2016, 75, 14399–14416. [Google Scholar] [CrossRef]
  196. Zhang, Y.; Ji, X.M.; Zhang, S.H. An approach to EEG-based emotion recognition using combined feature extraction method. Neurosci. Lett. 2016, 633, 152–157. [Google Scholar] [CrossRef] [PubMed]
  197. Zhang, J.H.; Chen, M.; Zhao, S.K.; Hu, S.Q.; Shi, Z.G.; Cao, Y. Relieff-based EEG sensor selection methods for emotion recognition. Sensors 2016, 16. [Google Scholar] [CrossRef] [PubMed]
  198. Yoon, H.J.; Chung, S.Y. EEG-based emotion estimation using Bayesian weighted-log-posterior function and perceptron convergence algorithm. Comput. Biol. Med. 2013, 43, 2230–2237. [Google Scholar] [CrossRef] [PubMed]
  199. Jirayucharoensak, S.; Pan-Ngum, S.; Israsena, P. EEG-based emotion recognition using deep learning network with principal component based covariate shift adaptation. Sci. World J. 2014. [Google Scholar] [CrossRef] [PubMed]
  200. Atkinson, J.; Campos, D. Improving BCI-based emotion recognition by combining EEG feature selection and kernel classifiers. Expert Syst. Appl. 2016, 47, 35–41. [Google Scholar] [CrossRef]
  201. Garcia-Martinez, B.; Martinez-Rodrigo, A.; Cantabrana, R.Z.; Garcia, J.M.P.; Alcaraz, R. Application of entropy-based metrics to identify emotional distress from electroencephalographic recordings. Entropy 2016, 18, 221. [Google Scholar] [CrossRef]
  202. Gupta, R.; Laghari, K.U.R.; Falk, T.H. Relevance vector classifier decision fusion and EEG graph-theoretic features for automatic affective state characterization. Neurocomputing 2016, 174, 875–884. [Google Scholar] [CrossRef]
  203. Padilla-Buritica, J.I.; Martinez-Vargas, J.D.; Castellanos-Dominguez, G. Emotion discrimination using spatially compact regions of interest extracted from imaging EEG activity. Front. Comput. Neurosci. 2016, 10. [Google Scholar] [CrossRef] [PubMed]
  204. Chen, J.; Hu, B.; Moore, P.; Zhang, X.W.; Ma, X. Electroencephalogram-based emotion assessment system using ontology and data mining techniques. Appl. Soft Comput. 2015, 30, 663–674. [Google Scholar] [CrossRef]
  205. Daimi, S.N.; Saha, G. Classification of emotions induced by music videos and correlation with participants’ rating. Expert Syst. Appl. 2014, 41, 6057–6065. [Google Scholar] [CrossRef]
  206. Chai, X.; Wang, Q.S.; Zhao, Y.P.; Liu, X.; Bai, O.; Li, Y.Q. Unsupervised domain adaptation techniques based on auto-encoder for non-stationary EEG-based emotion recognition. Comput. Biol. Med. 2016, 79, 205–214. [Google Scholar] [CrossRef] [PubMed]
  207. Kortelainen, J.; Vayrynen, E.; Seppanen, T. High-frequency electroencephalographic activity in left temporal area is associated with pleasant emotion induced by video clips. Comput. Intell. Neurosci. 2015, 2015. [Google Scholar] [CrossRef] [PubMed]
  208. Soleymani, M.; Asghari-Esfeden, S.; Fu, Y.; Pantic, M. Analysis of EEG signals and facial expressions for continuous emotion detection. IEEE Trans. Affect. Comput. 2016, 7, 17–28. [Google Scholar] [CrossRef]
  209. Goshvarpour, A.; Abbasi, A. Dynamical analysis of emotional states from electroencephalogram signals. Biomed. Eng. Appl. Basis Commun. 2016, 28. [Google Scholar] [CrossRef]
  210. Dennis, T.A.; Malone, M.M.; Chen, C.C. Emotional face processing and emotion regulation in children: An ERP study. Dev. Neuropsychol. 2009, 34, 85–102. [Google Scholar] [CrossRef] [PubMed]
  211. Marsella, P.; Scorpecci, A.; Vecchiato, G.; Maglione, A.G.; Colosimo, A.; Babiloni, F. Neuroelectrical imaging investigation of cortical activity during listening to music in prelingually deaf children with cochlear implants. Int. J. Pediatr. Otorhinolaryngol. 2014, 78, 737–743. [Google Scholar] [CrossRef] [PubMed]
  212. Tessier, S.; Lambert, A.; Scherzer, P.; Jemel, B.; Godbout, R. Rem sleep and emotional face memory in typically-developing children and children with autism. Biol. Psychol. 2015, 110, 107–114. [Google Scholar] [CrossRef] [PubMed]
  213. Khosrowabadi, R.; Quek, C.; Ang, K.K.; Wahab, A.; Chen, S.H.A. Dynamic screening of autistic children in various mental states using pattern of connectivity between brain regions. Appl. Soft Comput. 2015, 32, 335–346. [Google Scholar] [CrossRef]
  214. Matiko, J.W.; Wei, Y.; Torah, R.; Grabham, N.; Paul, G.; Beeby, S.; Tudor, J. Wearable EEG headband using printed electrodes and powered by energy harvesting for emotion monitoring in ambient assisted living. Smart Mater. Struct. 2015, 24, 1–11. [Google Scholar] [CrossRef]
  215. Lomas, T.; Edginton, T.; Cartwright, T.; Ridge, D. Men developing emotional intelligence through meditation? Integrating narrative, cognitive and electroencephalography (EEG) evidence. Psychol. Men Masc. 2014, 15, 213–224. [Google Scholar] [CrossRef]
  216. Bhatti, A.M.; Majid, M.; Anwar, S.M.; Khan, B. Human emotion recognition and analysis in response to audio music using brain signals. Comput. Hum. Behav. 2016, 65, 267–275. [Google Scholar] [CrossRef]
  217. Aydin, S.; Demirtas, S.; Ates, K.; Tunga, M.A. Emotion recognition with eigen features of frequency band activities embedded in induced brain oscillations mediated by affective pictures. Int. J. Neural Syst. 2016, 26. [Google Scholar] [CrossRef] [PubMed]
  218. Chew, L.H.; Teo, J.; Mountstephens, J. Aesthetic preference recognition of 3D shapes using EEG. Cogn. Neurodyn. 2016, 10, 165–173. [Google Scholar] [CrossRef] [PubMed]
  219. Peng, Y.; Lu, B.L. Discriminative manifold extreme learning machine and applications to image and EEG signal classification. Neurocomputing 2016, 174, 265–277. [Google Scholar] [CrossRef]
  220. Thammasan, N.; Moriyama, K.; Fukui, K.; Numao, M. Continuous music-emotion recognition based on electroencephalogram. IEICE Trans. Inf. Syst. 2016, E99D, 1234–1241. [Google Scholar] [CrossRef]
  221. Chiu, H.C.; Lin, Y.H.; Lo, M.T.; Tang, S.C.; Wang, T.D.; Lu, H.C.; Ho, Y.L.; Ma, H.P.; Peng, C.K. Complexity of cardiac signals for predicting changes in alpha-waves after stress in patients undergoing cardiac catheterization. Sci. Rep. 2015, 5. [Google Scholar] [CrossRef] [PubMed]
  222. Daly, I.; Williams, D.; Hallowell, J.; Hwang, F.; Kirke, A.; Malik, A.; Weaver, J.; Miranda, E.; Nasuto, S.J. Music-induced emotions can be predicted from a combination of brain activity and acoustic features. Brain Cogn. 2015, 101, 1–11. [Google Scholar] [CrossRef] [PubMed]
  223. Di, G.Q.; Wu, S.X. Emotion recognition from sound stimuli based on back-propagation neural networks and electroencephalograms. J. Acoust. Soc. Am. 2015, 138, 994–1002. [Google Scholar] [CrossRef] [PubMed]
  224. Georgieva, O.; Milanov, S.; Georgieva, P.; Santos, I.M.; Pereira, A.T.; Silva, C.F. Learning to decode human emotions from event-related potentials. Neural Comput. Appl. 2015, 26, 573–580. [Google Scholar] [CrossRef]
  225. Islam, M.; Ahmed, T.; Yusuf, M.S.U.; Ahmad, M. Cognitive state estimation by effective feature extraction and proper channel selection of EEG signal. J. Circuits Syst. Comput. 2015, 24. [Google Scholar] [CrossRef]
  226. Bairy, G.M.; Niranjan, U.C.; Puthankattil, S.D. Automated classification of depression EEG signals using wavelet entropies and energies. J. Mech. Med. Biol. 2016, 16. [Google Scholar] [CrossRef]
  227. Lamti, H.A.; Ben Khelifa, M.M.; Alimi, A.M.; Gorce, P. Emotion detection for wheelchair navigation enhancement. Robotica 2016, 34, 1209–1226. [Google Scholar] [CrossRef]
  228. Bairy, G.M.; Bhat, S.; Eugene, L.W.J.; Niranjan, U.C.; Puthankatti, S.D.; Joseph, P.K. Automated classification of depression electroencephalographic signals using discrete cosine transform and nonlinear dynamics. J. Med. Imaging Health Inform. 2015, 5, 635–640. [Google Scholar] [CrossRef]
  229. Bozhkov, L.; Koprinkova-Hristova, P.; Georgieva, P. Learning to decode human emotions with echo state networks. Neural Netw. 2016, 78, 112–119. [Google Scholar] [CrossRef] [PubMed]
  230. Hu, Y.; Jiang, Y.B.; Hu, P.P.; Ma, H.J.; Wang, K. Impaired social cognition in patients with interictal epileptiform discharges in the frontal lobe. Epilepsy Behav. 2016, 57, 46–54. [Google Scholar] [CrossRef] [PubMed]
  231. Zhang, Y.; Wang, C.F.; Sun, C.C.; Zhang, X.; Wang, Y.J.; Qi, H.Z.; He, F.; Zhao, X.; Wan, B.K.; Du, J.G.; et al. Neural complexity in patients with poststroke depression: A resting EEG study. J. Affect. Disord. 2015, 188, 310–318. [Google Scholar] [CrossRef] [PubMed]
  232. Kotchoubey, B.; Kaiser, J.; Bostanov, V.; Lutzenberger, W.; Birbaumer, N. Recognition of affective prosody in brain-damaged patients and healthy controls: A neurophysiological study using EEG and whole-head MEG. Cogn. Affect. Behav. Neurosci. 2009, 9, 153–167. [Google Scholar] [CrossRef] [PubMed]
  233. Brenner, C.A.; Rumak, S.P.; Burns, A.M.N. Facial emotion memory in schizophrenia: From encoding to maintenance-related EEG. Clin. Neurophysiol. 2016, 127, 1366–1373. [Google Scholar] [CrossRef] [PubMed]
  234. Meletti, S.; Cantalupo, G.; Santoro, F.; Benuzzi, F.; Marliani, A.F.; Tassinari, C.A.; Rubboli, G. Temporal lobe epilepsy and emotion recognition without amygdala: A case study of Urbach-Wiethe disease and review of the literature. Epileptic Disord. 2014, 16, 518–527. [Google Scholar] [PubMed]
  235. Papp, G.; Kovac, S.; Frese, A.; Evers, S. The impact of temporal lobe epilepsy on musical ability. Seizure Eur. J. Epilepsy 2014, 23, 533–536. [Google Scholar] [CrossRef] [PubMed]
  236. Akbarfahimi, M.; Tehrani-Doost, M.; Ghassemi, F. Emotional face perception in patients with schizophrenia: An event-related potential study. Neurophysiology 2013, 45, 249–257. [Google Scholar] [CrossRef]
  237. Frantzidis, C.A.; Bratsas, C.; Klados, M.A.; Konstantinidis, E.; Lithari, C.D.; Vivas, A.B.; Papadelis, C.L.; Kaldoudi, E.; Pappas, C.; Bamidis, P.D. On the classification of emotional biosignals evoked while viewing affective pictures: An integrated data-mining-based approach for healthcare applications. IEEE Trans. Inf. Technol. Biomed. 2010, 14, 309–318. [Google Scholar] [CrossRef] [PubMed]
238. Maglione, A.G.; Scorpecci, A.; Malerba, P.; Marsella, P.; Giannantonio, S.; Colosimo, A.; Babiloni, F.; Vecchiato, G. Alpha EEG frontal asymmetries during audiovisual perception in cochlear implant users: A study with bilateral and unilateral young users. Methods Inf. Med. 2015, 54, 500–504. [Google Scholar] [CrossRef] [PubMed]
  239. Gonzalez-Roldan, A.M.; Martinez-Jauand, M.; Munoz-Garcia, M.A.; Sitges, C.; Cifre, I.; Montoya, P. Temporal dissociation in the brain processing of pain and anger faces with different intensities of emotional expression. Pain 2011, 152, 853–859. [Google Scholar] [CrossRef] [PubMed]
  240. Pollatos, O.; Gramann, K. Electrophysiological evidence of early processing deficits in alexithymia. Biol. Psychol. 2011, 87, 113–121. [Google Scholar] [CrossRef] [PubMed]
  241. Eskenazi, P.I.; Hartmann, F.G.H.; Rietdijk, W.J.R. Why controllers compromise on their fiduciary duties: EEG evidence on the role of the human mirror neuron system. Account. Organ. Soc. 2016, 50, 41–50. [Google Scholar] [CrossRef]
  242. Leventon, J.S.; Bauer, P.J. Emotion regulation during the encoding of emotional stimuli: Effects on subsequent memory. J. Exp. Child Psychol. 2016, 142, 312–333. [Google Scholar] [CrossRef] [PubMed]
  243. Jessen, S.; Grossmann, T. Neural signatures of conscious and unconscious emotional face processing in human infants. Cortex 2015, 64, 260–270. [Google Scholar] [CrossRef] [PubMed]
  244. Amd, M.; Barnes-Holmes, D.; Ivanoff, J. A derived transfer of eliciting emotional functions using differences among electroencephalograms as a dependent measure. J. Exp. Anal. Behav. 2013, 99, 318–334. [Google Scholar] [CrossRef] [PubMed]
  245. Flaisch, T.; Hacker, F.; Renner, B.; Schupp, H.T. Emotion and the processing of symbolic gestures: An event-related brain potential study. Soc. Cogn. Affect. Neurosci. 2011, 6, 109–118. [Google Scholar] [CrossRef] [PubMed]
  246. Herbert, C.; Herbert, B.M.; Ethofer, T.; Pauli, P. His or mine? The time course of self-other discrimination in emotion processing. Soc. Neurosci. 2011, 6, 277–288. [Google Scholar] [CrossRef] [PubMed]
  247. Babiloni, C.; Vecchio, F.; Buffo, P.; Buttiglione, M.; Cibelli, G.; Rossini, P.M. Cortical responses to consciousness of schematic emotional facial expressions: A high-resolution EEG study. Hum. Brain Mapp. 2010, 31, 1556–1569. [Google Scholar] [CrossRef] [PubMed]
248. Balconi, M.; Mazza, G. Lateralisation effect in comprehension of emotional facial expression: A comparison between EEG alpha band power and behavioural inhibition (BIS) and activation (BAS) systems. Laterality 2010, 15, 361–384. [Google Scholar] [CrossRef] [PubMed]
249. Schirmer, A.; Escoffier, N. Emotional MMN: Anxiety and heart rate correlate with the ERP signature for auditory change detection. Clin. Neurophysiol. 2010, 121, 53–59. [Google Scholar] [CrossRef] [PubMed]
  250. Wacker, J.; Chayanon, M.L.; Stemmler, G. Resting EEG signatures of agentic extraversion: New results and meta-analytic integration. J. Res. Personal. 2010, 44, 167–179. [Google Scholar] [CrossRef]
  251. Zhang, Q.; Lee, M. A hierarchical positive and negative emotion understanding system based on integrated analysis of visual and brain signals. Neurocomputing 2010, 73, 3264–3272. [Google Scholar] [CrossRef]
252. Balconi, M.; Mazza, G. Brain oscillations and BIS/BAS (behavioral inhibition/activation system) effects on processing masked emotional cues. ERS/ERD and coherence measures of alpha band. Int. J. Psychophysiol. 2009, 74, 158–165. [Google Scholar] [CrossRef] [PubMed]
  253. Khittl, B.; Bauer, H.; Walla, P. Change detection related to peripheral facial expression: An electroencephalography study. J. Neural Transm. 2009, 116, 67–70. [Google Scholar] [CrossRef] [PubMed]
254. Morel, S.; Ponz, A.; Mercier, M.; Vuilleumier, P.; George, N. EEG-MEG evidence for early differential repetition effects for fearful, happy and neutral faces. Brain Res. 2009, 1254, 84–98. [Google Scholar] [CrossRef] [PubMed]
  255. Balconi, M.; Pozzoli, U. Event-related oscillations (ERO) and event-related potentials (ERP) in emotional face recognition. Int. J. Neurosci. 2008, 118, 1412–1424. [Google Scholar] [CrossRef] [PubMed]
  256. Osaka, K.; Tsuchiya, S.; Ren, F.J.; Kuroiwa, S.; Tanioka, T.; Locsin, R.C. The technique of emotion recognition based on electroencephalogram. Inf. Int. Interdiscip. J. 2008, 11, 55–68. [Google Scholar]
  257. Muller, M.M.; Andersen, S.K.; Keil, A. Time course of competition for visual processing resources between emotional pictures and foreground task. Cereb. Cortex 2008, 18, 1892–1899. [Google Scholar] [CrossRef] [PubMed]
  258. Guntekin, B.; Basar, E. Emotional face expressions are differentiated with brain oscillations. Int. J. Psychophysiol. 2007, 64, 91–100. [Google Scholar] [CrossRef] [PubMed]
  259. Chen, X.H.; Pan, Z.H.; Wang, P.; Zhang, L.J.; Yuan, J.J. EEG oscillations reflect task effects for the change detection in vocal emotion. Cogn. Neurodyn. 2015, 9, 351–358. [Google Scholar] [CrossRef] [PubMed]
  260. Karran, A.J.; Fairclough, S.H.; Gilleade, K. A framework for psychophysiological classification within a cultural heritage context using interest. ACM Trans. Comput. Hum. Interact. 2015, 21. [Google Scholar] [CrossRef]
261. Balconi, M.; Brambilla, E.; Falbo, L. BIS/BAS, cortical oscillations and coherence in response to emotional cues. Brain Res. Bull. 2009, 80, 151–157. [Google Scholar] [CrossRef] [PubMed]
  262. Sirca, F.; Onorati, F.; Mainardi, L.; Russo, V. Time-varying spectral analysis of single-channel EEG: Application in affective protocol. J. Med. Biol. Eng. 2015, 35, 367–374. [Google Scholar] [CrossRef]
  263. Jie, X.; Rui, C.; Li, L. Emotion recognition based on the sample entropy of EEG. Bio-Med. Mater. Eng. 2014, 24, 1185–1192. [Google Scholar]
264. Khosrowabadi, R.; Quek, C.; Ang, K.K.; Wahab, A. ERNN: A biologically inspired feedforward neural network to discriminate emotion from EEG signal. IEEE Trans. Neural Netw. Learn. Syst. 2014, 25, 609–620. [Google Scholar] [CrossRef] [PubMed]
265. Lee, G.; Kwon, M.; Kavuri, S.; Lee, M. Action-perception cycle learning for incremental emotion recognition in a movie clip using 3D fuzzy GIST based on visual and EEG signals. Integr. Comput. Aided Eng. 2014, 21, 295–310. [Google Scholar]
266. Lee, G.; Kwon, M.; Sri, S.K.; Lee, M. Emotion recognition based on 3D fuzzy visual and EEG features in movie clips. Neurocomputing 2014, 144, 560–568. [Google Scholar] [CrossRef]
  267. Walter, S.; Kim, J.; Hrabal, D.; Crawcour, S.C.; Kessler, H.; Traue, H.C. Transsituational individual-specific biopsychological classification of emotions. IEEE Trans. Syst. Man Cybern. Syst. 2013, 43, 988–995. [Google Scholar] [CrossRef]
  268. Yeh, C.L.; Lee, P.L.; Chen, W.M.; Chang, C.Y.; Wu, Y.T.; Lan, G.Y. Improvement of classification accuracy in a phase-tagged steady-state visual evoked potential-based brain computer interface using multiclass support vector machine. Biomed. Eng. Online 2013, 12. [Google Scholar] [CrossRef] [PubMed]
  269. Murugappan, M.; Nagarajan, R.; Yaacob, S. Combining spatial filtering and wavelet transform for classifying human emotions using EEG signals. J. Med. Biol. Eng. 2011, 31, 45–51. [Google Scholar] [CrossRef]
  270. Frantzidis, C.A.; Bratsas, C.; Papadelis, C.L.; Konstantinidis, E.; Pappas, C.; Bamidis, P.D. Toward emotion aware computing: An integrated approach using multichannel neurophysiological recordings and affective visual stimuli. IEEE Trans. Inf. Technol. Biomed. 2010, 14, 589–597. [Google Scholar] [CrossRef] [PubMed]
  271. Hosseini, S.A.; Khalilzadeh, M.A.; Changiz, S. Emotional stress recognition system for affective computing based on bio-signals. J. Biol. Syst. 2010, 18, 101–114. [Google Scholar] [CrossRef]
  272. Petrantonakis, P.C.; Hadjileontiadis, L.J. Emotion recognition from brain signals using hybrid adaptive filtering and higher order crossings analysis. IEEE Trans. Affect. Comput. 2010, 1, 81–97. [Google Scholar] [CrossRef]
273. Wu, D.R.; Courtney, C.G.; Lance, B.J.; Narayanan, S.S.; Dawson, M.E.; Oie, K.S.; Parsons, T.D. Optimal arousal identification and classification for affective computing using physiological signals: Virtual reality Stroop task. IEEE Trans. Affect. Comput. 2010, 1, 109–118. [Google Scholar] [CrossRef]
274. Ko, K.E.; Yang, H.C.; Sim, K.B. Emotion recognition using EEG signals with relative power values and Bayesian network. Int. J. Control Autom. Syst. 2009, 7, 865–870. [Google Scholar] [CrossRef]
  275. Petrantonakis, P.C.; Hadjileontiadis, L.J. Emotion recognition from EEG using higher order crossings. IEEE Trans. Inf. Technol. Biomed. 2010, 14, 186–197. [Google Scholar] [CrossRef] [PubMed]
  276. Hoshi-Shiba, R.; Furukawa, K.; Okanoya, K. Neural correlates of expectation of musical termination structure or cadence. Neuroreport 2014, 25, 743–748. [Google Scholar] [CrossRef] [PubMed]
  277. Koelstra, S.; Patras, I. Fusion of facial expressions and EEG for implicit affective tagging. Image Vis. Comput. 2013, 31, 164–174. [Google Scholar] [CrossRef]
  278. Moon, J.; Kim, Y.; Lee, H.; Bae, C.; Yoon, W.C. Extraction of user preference for video stimuli using EEG-based user responses. ETRI J. 2013, 35, 1105–1114. [Google Scholar] [CrossRef]
279. Lin, Y.P.; Wang, C.H.; Jung, T.P.; Wu, T.L.; Jeng, S.K.; Duann, J.R.; Chen, J.H. EEG-based emotion recognition in music listening. IEEE Trans. Biomed. Eng. 2010, 57, 1798–1806. [Google Scholar] [PubMed]
  280. Horska, E.; Bercik, J.; Krasnodebski, A.; Matysik-Pejas, R.; Bakayova, H. Innovative approaches to examining consumer preferences when choosing wines. Agric. Econ. 2016, 62, 124–133. [Google Scholar] [CrossRef]
  281. Li, X.W.; Hu, B.; Zhao, Q.L.; Liu, L.; Peng, H.; Qi, Y.B.; Mao, C.S.; Fang, Z.; Liu, Q.Y. Improve affective learning with EEG approach. Comput. Inform. 2010, 29, 557–570. [Google Scholar]
282. Dybala, P.; Ptaszynski, M.; Rzepka, R.; Araki, K. Evaluating subjective aspects of HCI on an example of a non-task oriented conversational system. Int. J. Artif. Intell. Tools 2010, 19, 819–856. [Google Scholar] [CrossRef]
  283. Rothkrantz, L.J.M.; Horlings, R.; Dharmawan, Z. Recognition of emotional states of car drivers by EEG analysis. Neural Netw. World 2009, 19, 119–128. [Google Scholar]
284. Littlefield, M. Constructing the organ of deceit: The rhetoric of fMRI and brain fingerprinting in post-9/11 America. Sci. Technol. Hum. Values 2009, 34, 365–392. [Google Scholar] [CrossRef]
  285. Al-Hudhud, G. Affective command-based control system integrating brain signals in commands control systems. Comput. Hum. Behav. 2014, 30, 535–541. [Google Scholar] [CrossRef]
Figure 1. Gartner Hype Cycle for Emerging Technologies 2016; Brain-computer interface and affective computing are noted by the arrows (Source: Gartner's 2016 Hype Cycle for Emerging Technologies, http://www.gartner.com/newsroom/id/3412017).
Figure 2. Procedure that we used to extract and filter articles.
Figure 3. Classification scheme.
Figure 4. Classification of articles by field and publication year.
Figure 5. Treemap classification of articles by application domain and field (color legend: non-medical vs. medical).
Figure 6. Challenges and future directions.
Table 1. Review article classifications.

- General/background. Review recent studies that investigate the recognition of affective states from EEG signals; these aim to present a general discussion of one or more aspects, such as neuroimaging techniques, emotion representation models, physiological signals, stimuli, feature extraction, and classification, and to discuss their current achievements and applications. References: Liberati et al., 2015 [7]; Verma et al., 2014 [8]; Rule et al., 2013 [9]; Keysers and Fadiga, 2008 [10]; Grossmann and Johnson, 2007 [11]; Muthukumaraswamy and Johnson, 2007 [12]; Schupp et al., 2006 [3].
- Signal processing. Survey recent developments in EEG signal processing, including filtering and artifact processing, signal enhancement methods, feature extraction methods, and channel selection methods. References: Alotaiby et al., 2015 [13]; Jenke et al., 2014 [14]; Knyazev, 2007 [15].
- Classification. Survey recent developments in machine learning, including classification methods, performance evaluation approaches, and post-processing methods. Reference: Kim et al., 2013 [16].
- Application (medical). Schizophrenia: Isaac et al., 2016 [17]; Campos et al., 2016 [18]. Disorders of consciousness: Harrison et al., 2013 [19]. Depression: Acharya et al., 2015 [20]. Autism: Bhat et al., 2014 [21].
- Application (non-medical). Games: Bontchev and Boyan, 2016 [22]. Traffic safety: Reyes-Munoz et al., 2016 [23].
- Other. Reviews of training protocols, validity in the EEG-neurofeedback optimal performance field, and evidence of neurofeedback learning. References: Gruzelier et al., 2014 [24]; Harmon-Jones et al., 2009 [25].
Table 2. Emotion elicitation techniques.

| Technique | Number of Articles | Domain (% Medical, % Non-Medical) |
|---|---|---|
| Visual-based elicitation using images | 88 | 26%, 73.9% |
| Prepared task | 43 | 25.6%, 47.4% |
| Audio-visual elicitation using short film video clips | 38 | 18.4%, 81.6% |
| Audio-based elicitation using music | 29 | 17.2%, 82.8% |
| Multiple techniques | 19 | 26.3%, 73.9% |
| Other | 17 | 11.7%, 88.2% |
| Imagination techniques/memory recall | 10 | 20%, 80% |
| Social interactions | 4 | 25%, 75% |
Table 3. Electroencephalography (EEG) devices.

| EEG Device | Number of Articles | References |
|---|---|---|
| Quik-cap, NuAmps (Compumedics NeuroScan Inc., El Paso, TX, USA) | 33 | [4,28,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77] |
| Active-electrodes (BioSemi Inc., Amsterdam, Netherlands) | 28 | [5,6,34,35,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,97,98,99,100,101] |
| EPOC (Emotiv Inc., San Francisco, CA, USA) | 24 | [2,39,102,103,104,105,106,107,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122,123] |
| Geodesic Sensor Net (Electrical Geodesics Inc., Eugene, OR, USA) | 22 | [26,30,124,125,126,127,128,129,130,131,132,133,134,135,136,137,138,139,140,141,142,143] |
| actiCAP, EASYCAP, BrainCap (Brain Products Inc., Munich, Germany) | 22 | [29,31,44,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,160,161,162] |
| EasyCap (FMS, Herrsching-Breitbrunn, Germany) | 15 | [27,37,163,164,165,166,167,168,169,170,171,172,173,174,175] |
| Electro-Cap (Electro-Cap International Inc., Eaton, OH, USA) | 9 | [176,177,178,179,180,181,182,183,184] |
| g.MOBIlab, g.EEGcap (g.tec Guger Technologies Inc., Graz, Austria) | 7 | [33,45,185,186,187,188,189] |
Table 4. Benchmark EEG emotional databases.

- DEAP. A multimodal dataset for the analysis of human affective states. The electroencephalogram (EEG) and peripheral physiological signals of 32 participants were recorded while each watched 40 one-minute excerpts of music videos. Participants rated each video in terms of arousal, valence, like/dislike, dominance, and familiarity. For 22 of the 32 participants, frontal face video was also recorded. The database is described in [92] and is available online at http://www.eecs.qmul.ac.uk/mmv/datasets/deap/ (a loading sketch follows this table). References: [92,194,195,196,197,198,199,200,201,202,203,204,205].
- SEED. The EEG signals of 15 subjects were recorded while they watched emotional film clips. Participants reported their emotional reactions by completing a questionnaire immediately after each clip. To investigate neural signatures and stable patterns across sessions and individuals, each subject performed the experiment in three sessions, with an interval of one week or longer between sessions. Facial video and EEG data were recorded simultaneously. The database is available online at http://bcmi.sjtu.edu.cn/~seed/download.html. Reference: [206].
- MAHNOB. A multimodal database for emotion recognition and implicit tagging. It includes the physiological signals of 27 participants in response to 20 stimulus videos. Subjects' emotional self-assessments are nine-point ratings, from 1 to 9, for both valence and arousal. The database is described in [101] and is available online at http://www.ibug.doc.ic.ac.uk/resources/mahnob-hci-tagging-database/. References: [41,207,208].
- eNTERFACE06_EMOBRAIN. A multimodal database for emotion recognition. It contains physiological signals from both the peripheral (galvanic skin response, respiration, and blood volume pressure) and central (EEG and frontal fNIRS) nervous systems of five subjects in response to picture stimuli. The database is available online at http://www.enterface.net/enterface06/docs/results/databases/eNTERFACE06_EMOBRAIN. Reference: [209].
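As a concrete starting point for the DEAP entry above, the following minimal Python sketch shows one way to load a subject file from the dataset's preprocessed Python release and binarize the valence ratings for a two-class recognition task. The directory and file name are placeholders, and the array shapes follow the dataset's online documentation (40 trials, 40 channels, and 8064 samples per subject, with the first 32 channels being EEG); adjust the path to your local copy.

```python
import pickle

# Placeholder path: DEAP's preprocessed Python release ships one pickle per
# subject (s01.dat ... s32.dat); adjust to your local copy.
with open("data_preprocessed_python/s01.dat", "rb") as f:
    subject = pickle.load(f, encoding="latin1")  # files were pickled under Python 2

# Per the dataset documentation: 40 trials x 40 channels x 8064 samples,
# with the first 32 channels being EEG (downsampled to 128 Hz).
eeg = subject["data"][:, :32, :]
ratings = subject["labels"]  # (40, 4): valence, arousal, dominance, liking (1-9)

# A common two-class setup: split the valence ratings at the scale midpoint.
y = (ratings[:, 0] > 5).astype(int)
print(eeg.shape, y.shape)  # (40, 32, 8064) (40,)
```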
Table 5. Computational methods to extract and classify emotional features from EEG. A minimal worked sketch of the most commonly reported combination follows the table.

| Category | Method | Number of Articles | References |
|---|---|---|---|
| Feature extraction | Frequency domain: power spectral density and band power, computed via the Fourier transform | 29 | [26,49,63,74,104,105,107,116,123,149,161,169,176,193,197,200,203,204,207,208,216,218,219,220,221,222,223,224,225] |
| Feature extraction | Time domain: activity, mobility, and complexity (Hjorth parameters); fractal dimension (Higuchi method) | 11 | [107,117,200,203,204,206,213,216,220,222,224] |
| Feature extraction | Wavelet domain: entropy and energy, computed via the wavelet transform | 7 | [186,201,203,213,216,217,226] |
| Feature extraction | Statistical features: median, standard deviation, kurtosis, symmetry, etc. | 6 | [6,104,117,200,204,226] |
| Classification | Support Vector Machine (SVM) | 24 | [49,104,106,107,116,117,157,176,186,190,193,196,197,202,203,204,213,216,218,220,225,226,227,228] |
| Classification | K-Nearest Neighbor (k-NN) | 10 | [49,104,107,190,204,207,213,216,218,228] |
| Classification | Linear Discriminant Analysis (LDA) | 4 | [26,123,176,227] |
| Classification | Artificial Neural Network (ANN) | 7 | [105,176,190,204,216,223,227] |
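To make the table concrete, here is a minimal Python sketch of the combination the surveyed articles report most often: frequency-domain band powers from a Fourier-based (Welch) power spectral density estimate, time-domain Hjorth parameters, and an SVM classifier. The sampling rate, band edges, window length, and classifier settings are illustrative assumptions rather than values taken from any particular study.

```python
import numpy as np
from scipy.signal import welch
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 128  # assumed sampling rate (matches DEAP's preprocessed release)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_power_features(trial, fs=FS):
    """Frequency-domain features: per-channel band power from a Welch PSD estimate."""
    freqs, psd = welch(trial, fs=fs, nperseg=2 * fs, axis=-1)  # trial: (channels, samples)
    df = freqs[1] - freqs[0]  # integrate the PSD over each band (rectangle rule)
    return np.concatenate([psd[:, (freqs >= lo) & (freqs < hi)].sum(axis=-1) * df
                           for lo, hi in BANDS.values()])

def hjorth_features(trial):
    """Time-domain features: Hjorth activity, mobility, and complexity per channel."""
    d1 = np.diff(trial, axis=-1)
    d2 = np.diff(d1, axis=-1)
    activity = np.var(trial, axis=-1)
    mobility = np.sqrt(np.var(d1, axis=-1) / activity)
    complexity = np.sqrt(np.var(d2, axis=-1) / np.var(d1, axis=-1)) / mobility
    return np.concatenate([activity, mobility, complexity])

def extract_features(trials):
    """Map (n_trials, n_channels, n_samples) EEG to a (n_trials, n_features) matrix."""
    return np.array([np.concatenate([band_power_features(t), hjorth_features(t)])
                     for t in trials])

# Example usage with the arrays from the DEAP loading sketch above:
# X = extract_features(eeg)                          # (40, n_features)
# clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
# print(cross_val_score(clf, X, y, cv=5).mean())     # chance level is ~0.5
```

The surveyed studies swap individual stages of this skeleton, e.g., wavelet energies or fractal dimensions in place of the spectral features, or k-NN, LDA, or an ANN in place of the SVM, while the overall extract-then-classify structure stays the same.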
Table 6. Medical application classification.

- Assessment. These articles assess performance in terms of health and non-health. Their data can be used to evaluate severity levels and to monitor progress and attainment, and information from these studies has been used to develop scoring systems for assessing emotionality. References: [4,47,72,79,94,106,118,119,120,133,135,145,153,167,212,213,230,231,232,233,234,235].
- Assistance. These articles provide insight into technologies and resources that support patients with a disorder or learning problem by means of assistive tools. Assistance occurs after emotion detection methods have been used to identify the skills and limitations of potential users. References: [6,33,90,103,125,140,146,165,168,180,221,236,237].
- Diagnosis. These articles describe how clinicians use EEG to interpret medical conditions, employing it as a diagnostic tool for patients with psychiatric and neurological disorders. Many of the relevant symptoms are emotional and therefore difficult to diagnose by subjective means; the goal of these systems is to objectively detect medical anomalies in a patient's emotional affective state. References: [20,62,66,76,137,183,226,228].
- Monitoring. These articles monitor patients during emotion processing to understand deficits in emotion and cognition. The data are used to characterize neural activity and the correlates of emotion in patients with different conditions and disorders, across different stimulus types. References: [5,52,54,73,75,81,98,99,107,114,121,122,124,126,156,181,211,238,239,240].
- Other. Review studies that report previous research efforts; they raise awareness of EEG emotional-response studies and of how EEG may be used in clinical practice to uncover potential neurophysiological abnormalities. References: [17,18,19,21].
Table 7. Non-medical application classification.

- Monitoring. These articles explored the effects of different stimulus types, test emotions, and elicitation methods, and investigated how different stimuli induce specific emotional reactions. They include work on brain lateralization, which aims to relate brain regions to specific behaviors and cognitive skills, and comparisons of emotional responses across genders and human developmental stages. References: [26,27,28,29,30,31,32,35,43,45,46,48,51,55,56,59,60,64,65,67,68,69,70,74,77,78,80,82,83,84,85,86,87,89,91,93,96,97,113,127,128,129,132,134,138,139,141,142,143,144,147,148,150,155,158,159,161,163,164,166,169,170,171,172,175,177,178,179,182,184,187,192,210,215,241,242,243,244,245,246,247,248,249,250,251,252,253,254,255,256,257,258,259,260,261].
- New method. These articles proposed approaches for detecting affective states using single- or multi-modality signal processing, including feature extraction and selection, machine learning, and pattern recognition methods. The proposed systems aim to explore or improve EEG-based emotion recognition. References: [2,39,41,42,49,50,57,61,63,92,104,108,109,117,131,136,149,152,157,173,174,185,186,189,191,195,196,197,198,199,200,201,202,203,204,205,206,207,208,209,217,219,223,224,225,229,262,263,264,265,266,267,268,269,270,271,272,273,274,275].
- Entertainment. These articles examined relationships between multimedia data (music/video) and human emotions, for example the effects of different types of music on subjects of different ages or genders. In gaming research, some articles sought to detect gamers' affective states in order to adapt game features such as difficulty, punishment, and encouragement. All of these were investigated using EEG-based emotion recognition systems. References: [34,37,44,53,58,71,88,100,101,110,111,112,115,116,151,154,190,193,216,220,222,276,277,278,279].
- Marketing. These articles sought to understand consumer responses to marketing stimuli using imaging techniques and the recognition of neurophysiological parameters. References: [102,123,218,280].
- Education. These articles tracked students' engagement and learning. References: [38,281].
- Assistance. These articles explored how assistive technologies or learning resources are provided to individuals, and then used to identify the skills, user experience, and limitations of potential users, and to improve behavior, cognition, and emotion regulation. References: [1,40,95,105,139,194,214,227,282,283].
- Others. Articles exploring other aspects, such as workload [176,188,284]; social interaction and cultural differences [9,36,130,160,162]; and review studies that reported previous research efforts [3,7,8,10,11,12,13,14,15,16,22,23,24,25].

