A Hybrid Neural Network RBERT-C Based on Pre-trained RoBERTa and CNN for User Intent Classification

  • Conference paper
Neural Computing for Advanced Applications (NCAA 2020)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1265)

Abstract

User intent classification plays a critical role in identifying the interests of users in question-answering and spoken dialog systems. The question texts in these systems are usually short, and the semantic information they convey is frequently insufficient, which can degrade the accuracy of user intent classification and, in turn, user satisfaction. To address this problem, this paper proposes a hybrid neural network named RBERT-C for text classification to capture user intent. The network uses Chinese pre-trained RoBERTa to initialize the parameters of its representation layer, obtains question representations through a bidirectional transformer structure, and then extracts essential features with a Convolutional Neural Network. The evaluation is based on the publicly available ECDT dataset containing 3736 labeled sentences. Experimental results indicate that our model RBERT-C achieves an F1 score of 0.96 and an accuracy of 0.96, outperforming a number of baseline methods.
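
The abstract gives the pipeline but no implementation details. A minimal sketch of such a RoBERTa + CNN intent classifier in PyTorch follows; it is an illustration, not the authors' code. The checkpoint name (hfl/chinese-roberta-wwm-ext), the kernel sizes, the filter count, and the intent count are all assumptions not stated in the paper.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class RobertaCnnIntentClassifier(nn.Module):
    """Pre-trained RoBERTa encoder + multi-width CNN + linear intent classifier."""

    def __init__(self, checkpoint, num_intents, filters=128, kernel_sizes=(2, 3, 4)):
        super().__init__()
        # Chinese pre-trained RoBERTa initializes the representation layer.
        self.encoder = AutoModel.from_pretrained(checkpoint)
        hidden = self.encoder.config.hidden_size  # 768 for the base model
        # One 1-D convolution per window width over the token sequence.
        self.convs = nn.ModuleList(
            nn.Conv1d(hidden, filters, k) for k in kernel_sizes
        )
        self.classifier = nn.Linear(filters * len(kernel_sizes), num_intents)

    def forward(self, input_ids, attention_mask):
        # Contextual token representations: (batch, seq_len, hidden)
        tokens = self.encoder(input_ids, attention_mask=attention_mask).last_hidden_state
        x = tokens.transpose(1, 2)  # Conv1d expects (batch, channels, seq_len)
        # Convolve, apply ReLU, then max-pool each feature map over the sequence.
        pooled = [torch.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return self.classifier(torch.cat(pooled, dim=1))  # intent logits


# Hypothetical usage; the checkpoint and the intent count are placeholders.
checkpoint = "hfl/chinese-roberta-wwm-ext"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = RobertaCnnIntentClassifier(checkpoint, num_intents=31)
batch = tokenizer(["今天天气怎么样"], return_tensors="pt", padding=True)
logits = model(batch["input_ids"], batch["attention_mask"])
```

Training would presumably add a cross-entropy loss over the logits and fine-tune the encoder jointly with the CNN, as is standard for BERT-style classifiers; the paper's exact training setup is not given in the abstract.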



Acknowledgements

This work was supported in part by the National Key R&D Program of China (2018YFB1003800, 2018YFB1003805), the China University Production Innovation Research Fund Project (2018A01007), the Philosophy and Social Science Planning Project of Guangdong Province (GD18CJY05), and the National Natural Science Foundation of China (61772146, 61832004).

Author information

Corresponding author

Correspondence to Tianyong Hao.

Copyright information

© 2020 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Liu, Y., Liu, H., Wong, L.P., Lee, L.K., Zhang, H., Hao, T. (2020). A Hybrid Neural Network RBERT-C Based on Pre-trained RoBERTa and CNN for User Intent Classification. In: Zhang, H., Zhang, Z., Wu, Z., Hao, T. (eds) Neural Computing for Advanced Applications. NCAA 2020. Communications in Computer and Information Science, vol 1265. Springer, Singapore. https://doi.org/10.1007/978-981-15-7670-6_26

  • DOI: https://doi.org/10.1007/978-981-15-7670-6_26

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-15-7669-0

  • Online ISBN: 978-981-15-7670-6

  • eBook Packages: Computer Science, Computer Science (R0)
