
State of the art: a review of sentiment analysis based on sequential transfer learning

Published in: Artificial Intelligence Review

Abstract

Sequential transfer learning has recently emerged as a technique for applying the "pretrain then fine-tune" paradigm, leveraging existing knowledge to improve performance on a variety of downstream NLP tasks, sentiment analysis being no exception. Previous literature has mostly focused on reviewing the application of various deep learning models to sentiment analysis. However, supervised deep learning methods are notoriously data hungry, and insufficient training data can make them impractical to apply. Sequential transfer learning alleviates this data-scarcity bottleneck and thereby facilitates the application of sentiment analysis. This study discusses the background of sequential transfer learning, reviews the evolution of pretrained models, extends the literature with applications of sequential transfer learning to different sentiment analysis tasks (aspect-based sentiment analysis, multimodal sentiment analysis, sarcasm detection, cross-domain sentiment classification, multilingual sentiment analysis, and emotion detection), and suggests future research directions on model compression, effective knowledge adaptation techniques, neutrality detection, and ambivalence handling.
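The "pretrain then fine-tune" paradigm named in the abstract can be illustrated with a minimal, self-contained sketch: a first stage learns generic word representations from unlabeled text, and a second stage reuses those frozen representations while training only a small task head on scarce labeled sentiment data. The toy corpus, the SVD-based word vectors, and the logistic-regression head below are illustrative assumptions for exposition only, not the methods surveyed in the article.

```python
# Hedged sketch of sequential transfer learning ("pretrain then fine-tune")
# on toy data; all data and model choices here are illustrative assumptions.
import numpy as np

# Stage 1 (pretrain): learn generic word vectors from an *unlabeled* corpus
# by factorizing a window-2 word co-occurrence matrix with SVD.
unlabeled = [
    "the movie was great and the acting was great",
    "the plot was dull and the pacing was slow",
    "great acting and a great plot",
    "a dull movie with slow pacing",
]
vocab = sorted({w for s in unlabeled for w in s.split()})
idx = {w: i for i, w in enumerate(vocab)}
co = np.zeros((len(vocab), len(vocab)))
for s in unlabeled:
    ws = s.split()
    for i, w in enumerate(ws):
        for u in ws[max(0, i - 2):i + 3]:   # symmetric window of size 2
            co[idx[w], idx[u]] += 1.0
U, S, _ = np.linalg.svd(co)
vecs = U[:, :8] * S[:8]                     # frozen 8-d "pretrained" embeddings

def embed(sentence):
    """Represent a sentence as the mean of its frozen pretrained word vectors."""
    return np.mean([vecs[idx[w]] for w in sentence.split() if w in idx], axis=0)

# Stage 2 (fine-tune): keep the embeddings frozen and train only a small
# task head (logistic regression) on a *tiny* labeled sentiment set.
train = [("great movie", 1), ("great acting", 1), ("dull plot", 0), ("slow pacing", 0)]
X = np.array([embed(s) for s, _ in train])
y = np.array([label for _, label in train], dtype=float)
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(2000):                       # plain gradient descent on log loss
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    grad = p - y
    w -= 0.5 * X.T @ grad / len(y)
    b -= 0.5 * grad.mean()

pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(float)
acc = (pred == y).mean()                    # the head fits the small labeled set
```

The design point this sketch makes is the one the review turns on: the expensive, data-hungry stage (representation learning) happens once on unlabeled text, so the downstream sentiment task only has to fit a small head, which remains feasible when labeled data is scarce.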



Funding

This work was funded by the Fundamental Research Grant Scheme provided by the Ministry of Higher Education of Malaysia under grant number FRGS/1/2019/STG06/UTAR/03/1.

Author information


Correspondence to Jireh Yi-Le Chan.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.



About this article


Cite this article

Chan, J.Y.L., Bea, K.T., Leow, S.M.H. et al. State of the art: a review of sentiment analysis based on sequential transfer learning. Artif Intell Rev 56, 749–780 (2023). https://doi.org/10.1007/s10462-022-10183-8
