1 Introduction: Content Personalization as a Key Strategic Element

According to the definition given by Osterwalder and Pigneur (2002), a business strategy is an approach based on the profiles of actors, their dynamics, and their roles within a specific business model, which determines the operational-level steps that must be undertaken to implement the overall business strategy. In the context of e-Commerce, the information strategy constitutes a vital part of the overall business strategy, as it aims at discovering new and profitable insights that improve customer relationships and enhance customer satisfaction (Osterwalder and Pigneur 2002). A deep analysis of customers' preferences and behaviors is among the key goals of the information strategy, and the success of this strategy relies on how thorough and accurate the customer data are and how efficiently online vendors can act upon these data to provide value propositions based on data-driven insights (Salonen and Karjaluoto 2016). This approach, however, requires fostering a more customer-centric business culture and prioritizing the alignment of the company's information and analytics strategy and business capabilities with the customers' needs (Simon et al. 2016).

To stay competitive, e-Commerce players have been developing and adopting differentiation strategies in order to attract and retain customers. A common differentiation approach among online retailers is web content personalization aimed at matching customers' needs and expectations (Tam and Ho 2006).

Various marketing studies adopt different definitions of personalization. The general concept of personalization, however, refers to the process whereby products and services are adjusted to match individual preferences by learning customer needs and transforming the gathered data into recommendations, offers, and promotions aimed at creating a competitive advantage for the focal company through the creation of idiosyncratic value (Montgomery and Smith 2009; Vesanen and Raulas 2006; Salonen and Karjaluoto 2016). In the e-Commerce context, personalization means the use of adaptive web content to convey personally relevant data to the consumer (Karat et al. 2003).

Content personalization is generally assumed to be among the most efficient information strategy approaches aimed at maximizing immediate and future business opportunities for online merchants by delivering the right content to the right person at the right moment (Cao and Li 2007; Ho and Bodoff 2014; Aguirre et al. 2015). Adolphs and Winkelmann (2008) argue that content personalization is a powerful tool which can help build strong customer relationships, as it allows addressing customers personally. In his research, Vesanen (2007) concludes that the more accurately online merchants can personalize their services, the better the preference match, experience, communication, reduced cognitive overload, and convenience their customers enjoy (Ansari and Mela 2003), which eventually increases customer satisfaction and loyalty (Rust and Chung 2006).

However, as the field grows, many important topics around content personalization in e-Commerce still lack substantial clarification, which creates ambiguity in this domain (Salonen and Karjaluoto 2016). One point that currently remains understudied is the role of users' emotions in content personalization in the context of online shopping (Pappas et al. 2017).

Although the concept of personalized services has been known to marketers since 1987 (Surprenant and Solomon 1987), it was not widely embraced until the idea of a customer-centric approach within e-Commerce attracted the attention of information systems and human-computer interaction researchers (Li and Karahanna 2015). Today, web content personalization is a focal research area in many scientific fields, especially marketing and information systems, and since it involves human-computer interaction, it is largely investigated in relation to technological advancements and applications (Salonen and Karjaluoto 2016). Essentially, implementing web content personalization implies the extensive use of technology, which makes topics such as recommender systems, user profiling principles, contextual data, and user emotion recognition (Adolphs and Winkelmann 2008) key research areas within multidisciplinary fields, where technological research models are supplemented with approaches from marketing and psychology (Salonen and Karjaluoto 2016).

The purpose of this research is to analyze the stream of studies on the role of user emotions in content personalization in e-Commerce. Specifically, the main goal of the paper is to examine why emotions play an important role in online shopping, which technologies and approaches exist to detect and capture user emotions, and to provide a comprehensive overview of these findings. We fill the research gap by synthesizing the extensive studies on the role of user emotions in personalized services within e-Commerce and by providing an aggregated, comprehensive concept matrix. The proposed concept matrix maps market-available emotion recognition technologies against the specific emotions these technologies are able to detect and process, as well as the industry domains where these solutions are currently applied.

This paper is organized as follows: Sect. 2 presents the methodology applied in this research. Section 3 introduces the background of the paper regarding the role of user emotions in content personalization within the e-Commerce field. In Sect. 4 we present a table which provides a clear snapshot of the strategic approach applied to extract user emotions, the data acquisition methods, and the list of key research contributors. Furthermore, we identify a comprehensive list of technologies which are currently available on the market and used to capture user emotions via facial expression and voice recognition. We developed a concept matrix based on the methodology proposed by Webster and Watson (2002) to synthesize the identified technologies with the user emotions they are able to capture, as well as the application fields of the selected solutions. In Sect. 5 we discuss the results, conclude the paper, and propose future research directions.

2 Methodology

The state-of-the-art review of prior relevant literature is an essential approach aimed at facilitating further academic research by closing areas where a plethora of studies already exists (Webster and Watson 2002). The goal of a state-of-the-art literature review is to uncover directions where research is needed by providing points of reflection on the present state and potential future directions of the subject matter (Salonen and Karjaluoto 2016). The current literature review aims to provide a comprehensive overview of the role of emotions within e-Commerce, giving special attention to the significance of the emotional variable in building content personalization strategies as well as to the extensive theoretical background.

2.1 Literature Search

The current study is built upon the stages of the concept-centric approach defined by Webster and Watson (2002) to determine the most relevant sources for the literature review. The method suggests commencing the review by identifying a preliminary pool of scientific papers, which is then narrowed down to the most relevant materials on which to build the review. The resources were selected with a clear research focus after reviewing a wide variety of respected and well-established academic and scientific materials. The primary data sources we leveraged were peer-reviewed academic journals from the "AIS Basket of 8", databases such as Google Scholar, PubsOnLine, EBSCOhost (Academic Search Complete), and ISI Web of Knowledge/ISI Web of Science, as well as the subject-specific ACM Digital Library.

2.2 Keywords, Inclusion and Exclusion Criteria

The main requirement for an article to be considered for further review was the presence of certain keywords. We focused on finding papers containing the following keywords in their titles: "Personali*" (both the z- and s-forms of the term are used in the literature, so we searched for these keyword variants in combination with "AND"), "Recommender system", "Customer-centric", "Customer AND user emotions", "Emotions recognition AND identification AND detection", and "Behavi*" (both "behavior" and "behaviour" appear in the literature, so these keywords were also sought in combination with "AND"). This search yielded 300 papers.

In order to retrieve the most relevant information, we applied a range of exclusion criteria. The first criterion concerned the review period, which included studies published between 2005 and 2018. This allowed us to draw on a broad range of literature without considering studies which might have become somewhat obsolete to date. We also decided to focus the review solely on the concept of personalization, omitting the concept of customization, in order to arrive at a clear and grounded literature review. After applying these filters, we screened the paper abstracts to identify how relevant each paper was for further elaboration. Scientific articles focused on technical aspects of recommender systems or recommender algorithms were also examined, but we considered them less important for the current study. At the end of this step, we ended up with 80 papers.

The collected papers were read with a clear research focus. At this stage, we also applied backward referencing to examine the works cited in the most relevant papers, as well as forward referencing to search across the papers citing those we considered relevant. We finished this step with 45 papers, which we then read in detail.

The remaining pool comprised 27 papers, which represented the most relevant materials as they had a clear focus on the role of user emotions in personalized services within e-Commerce. These papers became the foundation for the current literature review. Figure 1 illustrates the article selection process applied in this review.

Fig. 1. Articles selection process.

3 Background

In their study, Aguirre et al. (2015) claim that since personalization is all about adapting to customer preferences and needs, e-Commerce players need to be fully aware of how to collect high-quality primary data about their customers. This is made possible by deploying intelligent personalization algorithms capable of collecting and analyzing web activities to generate highly adaptive, dynamic, and personalized content for individual users (Mobasher et al. 2000). Data collection and processing involves different methods of harvesting data aimed at building customer profiles and adapting them for use in web personalization, mostly by recommender systems (Salonen and Karjaluoto 2016).
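
As a purely illustrative sketch (our own; the event names, weights, and users are invented, not drawn from the cited studies), web activity data of this kind might be aggregated into a simple per-user interest profile:

```python
from collections import Counter, defaultdict

# Hypothetical clickstream events: (user_id, event_type, product_category)
events = [
    ("u1", "view", "shoes"), ("u1", "view", "shoes"),
    ("u1", "add_to_cart", "shoes"), ("u1", "view", "jackets"),
    ("u2", "view", "books"), ("u2", "purchase", "books"),
]

# Weight stronger behavioral signals more heavily when building the profile.
weights = {"view": 1.0, "add_to_cart": 3.0, "purchase": 5.0}

profiles = defaultdict(Counter)
for user, event, category in events:
    profiles[user][category] += weights.get(event, 1.0)

# Normalized category affinities, usable as a simple personalization profile.
for user, counts in profiles.items():
    total = sum(counts.values())
    print(user, {cat: round(w / total, 2) for cat, w in counts.items()})
```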

3.1 Recommender Systems as a Content Personalization Tool

Today, recommender systems are widely implemented solutions: fully functional software systems that apply at least one "recommendation approach". Such an approach is specifically designed to predict the needs of online customers and recommend products for online purchase based on the collected data (Bǎdicǎ et al. 2011; Beel et al. 2016). According to Beel et al. (2016), a "recommendation approach" is a model which describes how to bring a recommendation concept into practice. The most widely adopted recommendation concepts within modern recommender systems are collaborative filtering and content-based filtering, which have fundamental differences in their underlying ideas (Beel et al. 2016).

  • Collaborative filtering. Collaborative filtering was one of the first personalization technologies to become widely available in the e-Commerce domain (Montgomery and Smith 2009). Collaborative filtering does not require an explicit user profile (Koren 2010); in the online retail context, it generates recommendations by predicting the utility of particular items for a specific user based on user votes retrieved from a user database. In order to generate more specific suggestions, the items or services need to be rated by many customers (Breese et al. 1998).

  • Content-based filtering. Content-based filtering analyses the content of information sources and creates a user profile based on the customer’s interests (past searches, item rating or preferences about specific goods) in terms of regularities of the items and services that have been rated highly (Pazzani 1999). Content-based filtering then relies on a user profile and recommends those items which match with the customer’s needs and preferences reflected in the customer profile (Uçar and Karahoca 2015).

  • Hybrid recommender systems. The previously introduced approaches may be combined in hybrid systems so that they complement each other and eliminate each other's drawbacks (Uçar and Karahoca 2015). For instance, collaborative filtering is based upon ratings of items made by other customers, which makes the implementation of a new recommender system challenging when such data are not yet available. Leveraging content-based filtering on top of collaborative filtering enables a deeper analysis of user profiles, which helps prevent a collaborative filtering-based recommender system from generating immature suggestions in its early stages (Tran and Cohen 2000). A minimal sketch contrasting the two underlying filtering concepts follows this list.
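
To make the distinction between the two main filtering concepts more concrete, the following minimal sketch (our own illustration with an invented toy rating matrix and item features, not the implementation of any cited system) contrasts user-based collaborative filtering with a simple content-based profile match:

```python
import numpy as np

# Toy user-item rating matrix (rows: users, columns: items); 0 = not rated.
ratings = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 0, 5, 4],
], dtype=float)

def cosine(u, v):
    """Cosine similarity between two vectors."""
    norm = np.linalg.norm(u) * np.linalg.norm(v)
    return float(u @ v / norm) if norm else 0.0

def collaborative_score(user, item):
    """Collaborative filtering: similarity-weighted vote of other users who rated the item."""
    others = [o for o in range(len(ratings)) if o != user and ratings[o, item] > 0]
    if not others:
        return 0.0
    sims = np.array([cosine(ratings[user], ratings[o]) for o in others])
    votes = np.array([ratings[o, item] for o in others])
    return float(sims @ votes / sims.sum()) if sims.sum() else 0.0

# Content-based filtering: items described by features; the user profile is the
# mean feature vector of items the user rated highly.
item_features = np.array([   # e.g. [electronics, clothing, outdoor]
    [1, 0, 0],
    [1, 0, 1],
    [0, 1, 0],
    [0, 1, 1],
], dtype=float)

def content_score(user, item):
    liked = ratings[user] >= 4            # items the user rated highly
    if not liked.any():
        return 0.0
    profile = item_features[liked].mean(axis=0)
    return cosine(profile, item_features[item])

print(collaborative_score(user=1, item=2))  # vote of similar users
print(content_score(user=1, item=2))        # match against the user's own profile
```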

All the approaches described focus on recommending items to customers based on rigidly established algorithms and operate only within a user-item matrix, where new data are simply compounded on top of the current user's profile (Lombardi et al. 2009). Costa and Macedo (2013) argue that although multiple approaches and dimensions exist to make recommender systems more efficient, most existing recommender systems do not consider human emotions when recommending content. In his research, Polignano (2015) likewise concludes that traditional recommender systems mostly do not address emotions in the computational process, and that only recent papers explain why emotions are highly important for providing personalized services and how emotions can be recognized with the help of technology.

3.2 Emotions as a Contextual Variable in Content Personalization

Although the role of user emotions has been widely acknowledged in multiple studies, the number of studies on the importance of emotions for content personalization within e-Commerce is relatively low. This might be explained by the lack of a commonly accepted definition of what constitutes emotions. However, scholars in the e-Commerce domain have elaborated on the range of emotions and prioritized the following categories: social/personality issues (the effect of emotions on consumption judgment and reasoning), cognitive factors (cognitive and social construction in emotions), and the correlation of emotions with other consumption factors (Gaur et al. 2014).

To attain theoretical clarity, a precise clarification of the difference between emotions and feelings is necessary. In his paper, Walla (2017) attempts to distinguish between emotions and feelings and to provide clear definitions of both notions. According to Walla (2017), feelings represent "consciously felt bodily responses" which can arise from subcortical affective processing, whereas emotions are behavioral output based on affective processing, which can help communicate, but not necessarily fully reflect, a person's feelings to others.

In the past decade, several context-aware recommender algorithms have been implemented and have proved useful for enhancing recommendation performance in numerous domains by additionally incorporating context information (Gorgoglione et al. 2011). The most widely adopted context factors are the time of day, the day of the week, and location, which, according to Zheng (2016), can be captured relatively easily from the ubiquitous environment. Recommender systems have traditionally applied data-centric computational approaches to content recommendation and user modeling (Tkalčič et al. 2011). However, relatively few studies address why it is important to take user emotions into consideration when designing personalized services (Zheng et al. 2013).
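
As a simple illustration of one way such context factors can be incorporated (a contextual pre-filtering sketch of our own; the ratings and context values are invented), ratings can be restricted to the current context before any conventional recommendation logic runs:

```python
# Each rating is tagged with the context in which it was given.
ratings = [
    {"user": "u1", "item": "i1", "rating": 5, "time_of_day": "evening", "location": "home"},
    {"user": "u2", "item": "i1", "rating": 2, "time_of_day": "morning", "location": "work"},
    {"user": "u2", "item": "i2", "rating": 4, "time_of_day": "evening", "location": "home"},
]

def prefilter(ratings, **context):
    """Contextual pre-filtering: keep only ratings whose context matches the current one."""
    return [r for r in ratings
            if all(r.get(key) == value for key, value in context.items())]

# Only evening/at-home ratings feed the (conventional, context-free) recommender afterwards.
evening_home = prefilter(ratings, time_of_day="evening", location="home")
print(evening_home)
```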

Research by González et al. (2007) supports the view that purchasing decisions are always accompanied by emotions. This makes customer-centric companies put effort into detecting the connections between users' actions and their emotions in order to better understand how to increase retention, loyalty, and satisfaction among customers through enhanced personalized services (González et al. 2007). In their research, Zheng et al. (2013) showed that the precision of relevant suggestions provided by recommender systems increases when an emotional variable is considered. In their study, Tkalčič et al. (2013) conducted a comparative analysis of generic metadata-based and affective metadata-based recommender systems. The authors designed a methodology and conducted an experiment which demonstrated that the recommender system capable of utilizing emotional context performed better than the recommender system based only on generic metadata.

4 Results

4.1 Strategies to Identify User Emotions

According to Zheng (2016), emotional states are usually dynamic and tend to change over time. Therefore, user profiles associated with different affective reactions can be highly important in assisting recommender systems in making relevant content suggestions (Zheng 2016), which, in turn, can facilitate users' decision-making (Picard et al. 2004). In their study, González et al. (2007) describe the emotional factor as "the relevance that each user gives to differential values of items (i.e. events, services, products), which are demonstrated in the user's decision-making process by means of his or her actions". González et al. (2007) also argue that the emotional factor particularly influences rational thinking when users receive recommendations.

The important role of emotions in recommender systems has been discussed by Tkalčič et al. (2011), where the authors demonstrate how to identify emotions during the interaction with a recommender system in three data acquisition stages along the consumption chain of an item: the entry stage (i.e. before the activity), the consumption stage (i.e. during the activity), and the exit stage (i.e. after the activity).

In Table 1 we summarize the key data about each stage and list the most prominent studies addressing the peculiarities of the data acquisition methods within each stage. The way the data are compiled and synthesized in the table allows the reader to quickly grasp the key points of the emotion identification strategy proposed by Tkalčič et al. (2011). Furthermore, we also present the key contributors who put significant effort into exploring each strategic step separately to discover new and important insights.

Table 1. Description of emotions identification stages, data acquisition methods, and key research contributors.

According to Tkalčič et al. (2011), the emotional states of users can be identified by applying explicit and implicit approaches. Explicit acquisition of data about user emotions is usually carried out by applying the Self-Assessment Manikin (SAM) developed by Bradley and Lang (1994). This non-verbal pictorial assessment technique, designed as a paper-and-pencil questionnaire and used in groups, evaluates the pleasure, arousal, and dominance associated with a person's emotional reaction to an object or event (Bradley and Lang 1994). Implicit detection of emotions, although intrusive in nature, has proved to be a more accurate emotion detection approach than the explicit one, since users under observation are not aware that their emotions are being captured, which reduces the likelihood of data distortion (Tkalčič et al. 2011).
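
As a minimal sketch of how an explicit SAM response might be stored and normalized (our own illustration; the class name and the mapping to [-1, 1] are assumptions, not part of Bradley and Lang's instrument):

```python
from dataclasses import dataclass

@dataclass
class SamRating:
    """Explicit self-report on the three SAM scales, each on a 1-9 pictorial scale."""
    pleasure: int    # 1 = unpleasant ... 9 = pleasant
    arousal: int     # 1 = calm       ... 9 = excited
    dominance: int   # 1 = controlled ... 9 = in control

    def normalized(self):
        """Map each 1-9 rating to [-1, 1] so it can be used as a (V, A, D) point."""
        return tuple((x - 5) / 4 for x in (self.pleasure, self.arousal, self.dominance))

rating = SamRating(pleasure=7, arousal=3, dominance=6)
print(rating.normalized())   # (0.5, -0.5, 0.25)
```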

The most widely adopted emotion categories include happiness, sadness, fear, anger, disgust, and surprise. These fundamental emotions were thoroughly investigated and categorized into the universal emotions model through the cross-cultural studies conducted by Ekman and Rosenberg (1993), indicating that humans perceive certain basic emotions through facial expressions in the same way, regardless of their culture (Zeng et al. 2009). However, since the universal model provides only a limited set of distinct emotional categories, the circumplex multidimensional VAD model of affect was proposed by Posner et al. (2005). The proposed model describes each emotion as a dynamic concept within a multidimensional space (Tkalčič et al. 2011).

The VAD model describes emotions using three dimensions: Valence (V), which evaluates how positive or pleasant an emotion is, ranging from negative to positive; Arousal (A), which assesses the person's level of involvement, ranging from non-active and calm to agitated and ready to act; and Dominance (D), which measures the person's level of control over the situation, ranging from submissive (non-control) to dominant (in-control) (Kosti et al. 2017).
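
To illustrate how the dimensional and categorical views can be connected (a sketch of our own; the prototype coordinates are rough, invented placements in a normalized VAD space, not values from the cited literature), a normalized (V, A, D) point such as the one produced by the SAM sketch above can be mapped to the nearest basic-emotion prototype:

```python
import math

# Rough, illustrative prototype positions of the basic emotions in normalized (V, A, D) space.
PROTOTYPES = {
    "happiness": ( 0.8,  0.5,  0.4),
    "sadness":   (-0.6, -0.4, -0.3),
    "fear":      (-0.6,  0.7, -0.6),
    "anger":     (-0.5,  0.7,  0.4),
    "disgust":   (-0.6,  0.3,  0.2),
    "surprise":  ( 0.2,  0.8,  0.0),
}

def nearest_emotion(vad):
    """Return the basic-emotion label whose prototype is closest to the given (V, A, D) point."""
    return min(PROTOTYPES, key=lambda label: math.dist(vad, PROTOTYPES[label]))

print(nearest_emotion((0.5, -0.5, 0.25)))   # a pleasant, calm state -> "happiness"
```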

Although most of the emotional expressions occurring in realistic interpersonal or human-computer interactions are non-basic emotions (Cowie et al. 2005), the majority of the technologies and approaches developed to date are designed to identify the six basic human emotions (Kosti et al. 2017).

4.2 Emotions Recognition from Facial Expressions

As people use facial expressions as a natural means of emotional communication, facial expression recognition is one of the most well-studied channels of emotional expression. The analysis of facial expressions is a relevant and important topic from both scientific and applied standpoints (Soleymani et al. 2012). However, in his paper, Walla (2017) states that eliciting the respective responses in an observer's brain by observing someone's facial expressions might be challenging, since faces do not necessarily reflect a person's deep internal affective states, which might become a confounder for emotion recognition approaches.

The topic of understanding internal decision-making drivers by recognizing facial expressions nevertheless keeps attracting the interest of information systems and human-computer interaction researchers as well as business-oriented professionals, which has resulted in the rapid development of multiple technology companies (Garcia-Garcia et al. 2017). We briefly present six technologies capable of recognizing emotions from facial expressions which are available on the market today; a hypothetical integration sketch follows the list.

  • nViso. Developed by a Swiss company, this technology can detect the six basic emotional groups described by Ekman and Rosenberg (1993) using a proprietary deep learning algorithm. According to the data provided on the website, nViso can capture the emotions of one person or a group of people in real time with the help of a webcam which tracks the muscle movements of the face (nViso 2018). nViso partnered with IBM to create an emotional intelligence cloud solution capable of analyzing facial expressions to help financial advisors better understand their clients' financial needs (IBM 2018). Furthermore, together with ePAT Technologies, nViso is actively working on a smartphone-based medical device which is able to assess pain levels in real time by analyzing the facial muscle movements of patients (IBM 2018; ePAT Technologies Ltd. 2017).

  • Affectiva. By analyzing emotions from twenty facial zones retrieved from a database of videos and images (Affectiva 2018a), this emotion recognition software can detect seven emotions (anger, contempt, disgust, fear, happiness, sadness, and surprise) as well as measure the valence and arousal of the person (Affectiva 2018b). Affectiva has recently partnered with Voxpopme, which is a global emotion recognition software provider, to work on a platform enabling advanced analysis of facial expressions within video feedback (Business Wire 2017). Furthermore, Affectiva helps clients develop analytics solutions in multiple domains, including healthcare, education, media and advertising, retail, and gaming (Affectiva 2018a).

  • EmoVu by Eyeris. This software solution exploits deep learning algorithms which retrieve data from large datasets covering people of various ages, ethnicities, genders, etc. EmoVu can recognize emotions such as anger, disgust, fear, happiness, neutral, sadness, and surprise, and can also measure the degree of arousal and valence (Eyeris 2018a). Eyeris specializes mostly in the development of facial analytics and emotion recognition technology for the automotive sector, and its most prominent customers include Toyota Motor Corporation and Honda Motor Co. (Eyeris 2017). Furthermore, Eyeris has recently partnered with AvatarMind, the creator of the iPal® Robot, a humanoid robot which serves as a social companion, educator, and safety monitor for children and the elderly (Eyeris 2018b).

  • Kairos. This technology provides data about a person's six emotions, level of attention, and sentiment based on analyzed videos or images. Furthermore, the services provided by Kairos include age, ethnicity, and gender identification as well as group face recognition and detection (Kairos 2018b). The emotion recognition software provided by Kairos has been implemented by companies such as The Interpublic Group of Companies, Legendary Entertainment, and PepsiCo, operating in multiple domains including advertising and media, retail, and banking and insurance (Kairos 2018a).

  • Microsoft Cognitive Services. The Cognitive Services pack provided by Microsoft can identify the faces and emotional expressions of people by processing pictures and videos. The software identifies the six basic emotional groups described by Ekman and Rosenberg (1993) as well as contempt and neutrality (Microsoft 2018a). Microsoft provides its Cognitive Services to businesses involved in manufacturing, healthcare, media and telecommunications, education, banking and insurance, retail, etc. Featured clients include ABB Group, Daimler AG, Allergan, Telefonica, etc. (Microsoft 2018b).

  • FaceReader by Noldus. This automatic recognition software can analyze up to 500 facial points to recognize emotions such as neutral, contempt, boredom, interest, and confusion. Furthermore, FaceReader also calculates gaze direction, head orientation, and person characteristics (Noldus 2018a). Noldus's clients are mostly involved in healthcare, retail, and education services and include companies such as Pfizer, GlaxoSmithKline, Carnegie Mellon University, University of Maryland, Johnson & Johnson, etc. (Noldus 2018b).
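
The services above differ in their details, but a common integration pattern is to send a captured frame to a cloud endpoint and receive emotion scores in return. The sketch below is purely hypothetical: the endpoint URL, header, and response fields are invented placeholders and do not correspond to the API of any vendor listed above.

```python
import requests

API_URL = "https://emotion-api.example.com/v1/analyze"   # hypothetical endpoint
API_KEY = "YOUR_API_KEY"                                  # hypothetical credential

def detect_emotions(image_path):
    """Post a webcam frame to a (hypothetical) emotion-recognition service and return its scores."""
    with open(image_path, "rb") as f:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"image": f},
            timeout=10,
        )
    response.raise_for_status()
    # Assumed response shape: {"emotions": {"happiness": 0.82, "surprise": 0.10, ...}}
    return response.json()["emotions"]

scores = detect_emotions("frame.jpg")
dominant = max(scores, key=scores.get)
print(dominant, scores[dominant])   # e.g. feed this into the recommender's emotional context
```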

4.3 Emotions Recognition from Speech

Human speech is another essential channel for gathering emotional data, since accurate, real-time understanding of human speech can significantly facilitate human-computer interaction (Tao et al. 2018) and therefore improve the work of personalized services and technologies in online shopping. In the last few years, emotion recognition from speech has attracted particular scientific interest, resulting in multiple studies (El Ayadi et al. 2011) which, in turn, became a solid ground for rapid technological development. We briefly review five technologies which utilize speech to elicit emotions and are available on the market; a rough feature-extraction sketch follows the list.

  • Vokaturi. The Amsterdam-based company developed a solution which can detect directly from the voice whether people are happy, sad, afraid, angry, or in a neutral state of mind. Vokaturi has been validated against multiple existing emotion databases and works in a language-independent manner (Vokaturi 2018). Vokaturi has recently established a partnership with Affectiva to join efforts on an emotion-sensing product for the autonomous vehicle sector (Affectiva 2018).

  • Good Vibrations Company B.V. This solution can recognize a person's emotions by processing recorded voice. Good Vibrations measures the acoustic properties of the user's voice and performs a real-time analysis of the user's emotions to recognize stress, pleasure, and arousal (Good Vibrations Company B.V. 2018). According to the official website of Good Vibrations Company B.V., the areas where the solution has the greatest potential include healthcare, advertising, gaming, sports, business, robotics, safety, and matching (Good Vibrations Company B.V. 2018).

  • audEERING. This Munich-based tech company developed intelligent audio analysis algorithms to help organizations integrate audio analysis technology into their products. The embedded automated paralinguistic speech analysis allows detecting a multitude of attributes from the human voice, such as emotions and affective states (valence, arousal, dominance), age, alertness, or personality (audEERING 2018a). audEERING's clients represent multiple domains such as manufacturing, telecommunications, education, and retail, and include BMW, Daimler, T-Mobile, Deutsche Welle, Huawei, etc. (audEERING 2018b).

  • Beyond Verbal. The solution provided by this Israel-based company is capable of extracting multiple acoustic features from a speaker's voice in real time and providing insights into the emotional, health, and wellbeing condition of the user. By utilizing voice-driven emotion analytics, the technology can recognize anger, sadness, neutral, and happiness, as well as measure valence, arousal, and temper in the speaker's voice. Beyond Verbal's clients mostly represent companies from the retail, media, and marketing sectors and include Amdoc, FRONTLINE Selling, and Department26 (Beyond Verbal 2018).

  • Nemesysco. The company provides advanced voice analysis technologies for emotion detection, personality, and risk assessment. The technology is based on proprietary signal-processing algorithms which can extract over 150 acoustic parameters from the voice and classify the collected properties into major emotional groups, including anger, happiness, satisfaction, and arousal. The key domains currently covered by Nemesysco are retail, banking and insurance, and security. Key customers include Nestle, Allianz, and Europ Assistance (Nemesysco 2018).
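
Commercial details aside, such tools build on low-level acoustic features of the voice. The following crude sketch (our own illustration using the open-source librosa library, not the pipeline of any vendor above) extracts two such features, frame-wise energy and fundamental frequency, which are often treated as rough proxies for vocal arousal:

```python
import librosa
import numpy as np

def rough_acoustic_features(wav_path):
    """Extract crude prosodic features often associated with vocal arousal."""
    y, sr = librosa.load(wav_path, sr=None)          # load audio at its native sampling rate

    # Loudness proxy: frame-wise root-mean-square energy.
    rms = librosa.feature.rms(y=y)[0]

    # Pitch proxy: frame-wise fundamental frequency (f0) via probabilistic YIN.
    f0, voiced_flag, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
    )
    f0_voiced = f0[voiced_flag] if voiced_flag.any() else np.array([])

    return {
        "mean_rms": float(np.mean(rms)),
        "mean_f0_hz": float(np.mean(f0_voiced)) if f0_voiced.size else None,
        "f0_std_hz": float(np.std(f0_voiced)) if f0_voiced.size else None,
    }

print(rough_acoustic_features("utterance.wav"))   # higher energy / pitch variability ~ higher arousal (roughly)
```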

Table 2 aggregates the technologies described and maps them against the emotional dimensions described in Sect. 4.1. Furthermore, the concept matrix also demonstrates in which domains the listed technologies are currently in use. The data provided in this table were retrieved from the official websites and contain only information which has been officially published by the software vendors.

Table 2. Emotions recognition technologies, targeted emotions, and technologies application fields.

5 Discussion

People's emotions play an essential role in many aspects of life, especially in decision-making. When it comes to user-centric approaches in e-Commerce, enriching recommender systems with emotion recognition functionality is a vital technical and strategic decision which can amplify the generation of useful and relevant content suggestions. There is an obvious shift towards embracing the emotional variable in personalized services; however, the technical capabilities of the recommender systems available today are still quite limited in terms of emotion recognition and its utilization for content personalization.

Multiple studies show that technologies and recommender systems capable of recognizing and interpreting emotions in the same way as other human beings have tremendous potential in human-computer interaction, human-assistive technologies, and e-Commerce. According to Walla (2017), however, the scientific community currently lacks a unified definition of what an emotion is: some scholars understand emotions as neural activities, others describe them as felt affective phenomena, and some see emotions as facial expressions. Furthermore, Walla (2017) argues that affective processing does not always result in observable and measurable emotions. This might have serious implications for marketing and IS experts, since the emotions that are elicited and measured might not fully reflect the underlying affective processing. This, in turn, might mislead the interpretation of outcomes and result in irrelevant products and services that nobody wants or needs (Walla 2017).

According to Kreibig (2010), collecting reliable and valid data on autonomic responding in emotions has long been, and remains, a challenge within emotion research. To gain a deeper understanding of the functional role of emotions, future researchers need to thoroughly investigate and verify the particular type of emotions elicited (Kreibig 2010). This needs to be done in order to understand more deeply how emotions, as well as their variations, reflect a specific state of affective processing of a particular individual. A comprehensive approach aimed at retrieving user emotions through a deep analysis of affective processing could be created to facilitate the understanding of users' tasks and goals, internal decision drivers, and contextual properties and characteristics. Such a scientific method could be of great significance in providing insights into how to make personalized services within e-Commerce more precise, effective, and efficient.

Deep knowledge of the relationship between the utilization of user emotions and the development of content personalization systems would lead to significant improvements in the creation of specific personalization features and technologies. Furthermore, a better understanding of the impact of the emotional variable on decision-making during online shopping would facilitate the design of recommender system architectures that enable online merchants to provide a better customer experience. A deep understanding of how user emotions impact and shape the decision-making process could give rise to a new generation of personalization technologies providing value both to users and to organizations.

In the future, we will conduct a systematic study considering a larger number of studies, including the most recent state-of-the-art research, in order to provide even more valuable insights into how user emotions are captured and processed by emerging technologies. As a further step, our next research will go deeper into the concepts of context-aware recommender systems and affective computing to identify which scientific methods and approaches are being created to advance the development of personalization-focused technologies.