Comparison of BERT and Snips models for natural language understanding for the A.D.A. virtual assistant

  • Leonardo Costa Santos, USP
  • Antonio Deusany de Carvalho Junior, USP
  • Alfredo Goldman, USP

Abstract


Natural Language Understanding is an important area of Natural Language Processing, concerned with understanding and extracting semantic information from text or speech. It is an essential component of any modern virtual assistant, responsible for identifying the intent of the user and the objects of their speech. In this work, we compared two models, BERT and Snips, for natural language understanding. These models were tested with the goal of being integrated into the A.D.A. virtual assistant, performing the tasks of intent classification and slot filling. As preliminary results, we obtained high precision in both tasks with the tested solutions. Therefore, we concluded that both solutions are adequate for integration into A.D.A., depending on the expected results.
Keywords: Big Data, Clouds, Grids, Clusters, and Peer-to-Peer Computing, Mobile, Pervasive, and Embedded Computing, Computer Networks and Protocols for High-Performance Communication
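As an illustration of the intent-classification and slot-filling setup compared in the paper, the following is a minimal sketch, not the authors' implementation, of a joint BERT model in the style of Chen et al. (2019). It assumes the Hugging Face Transformers library (Wolf et al., 2020) and the BERTimbau checkpoint neuralmind/bert-base-portuguese-cased (Souza et al., 2020); the label counts and the Portuguese example utterance are placeholders.

# Minimal joint intent-classification / slot-filling sketch (illustrative only).
import torch
from torch import nn
from transformers import AutoModel, AutoTokenizer

class JointBertNLU(nn.Module):
    def __init__(self, model_name, num_intents, num_slot_labels):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        self.intent_head = nn.Linear(hidden, num_intents)    # utterance-level intent
        self.slot_head = nn.Linear(hidden, num_slot_labels)  # per-token slot tags

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        intent_logits = self.intent_head(out.last_hidden_state[:, 0])  # [CLS] token
        slot_logits = self.slot_head(out.last_hidden_state)            # all tokens
        return intent_logits, slot_logits

model_name = "neuralmind/bert-base-portuguese-cased"  # BERTimbau (Souza et al., 2020)
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = JointBertNLU(model_name, num_intents=5, num_slot_labels=9)  # placeholder sizes

batch = tokenizer("acenda a luz da sala", return_tensors="pt")
intent_logits, slot_logits = model(batch["input_ids"], batch["attention_mask"])
print(intent_logits.shape, slot_logits.shape)  # (1, 5) and (1, seq_len, 9)

On the Snips side, the snips-nlu Python library covers the same two tasks in a single call: an engine is trained with SnipsNLUEngine.fit() on a dataset of example utterances and then queried with SnipsNLUEngine.parse(), which returns the detected intent together with the filled slots.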

References

Chen, Q., Zhuo, Z., and Wang, W. (2019). BERT for joint intent classification and slot filling. arXiv preprint arXiv:1902.10909.

Coucke, A., Saade, A., Ball, A., Bluche, T., Caulier, A., Leroy, D., Doumouro, C., Gisselbrecht, T., Caltagirone, F., Lavril, T., et al. (2018). Snips voice platform: an embedded spoken language understanding system for private-by-design voice interfaces. arXiv preprint arXiv:1805.10190, pages 12–16.

Devlin, J., Chang, M.-W., Lee, K., and Toutanova, K. (2018). BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805.

Freire, F., Rosa, T., Feulo, G., Elmadjian, C., Cordeiro, R., Moura, S., Andrade, A., de Omena, L. A., Vicente, A., Marques, F., Sheffer, A., Hideki, O., Nascimento, P., Cordeiro, D., and Goldman, A. (2020). Toward Development of A.D.A. – Advanced Distributed Assistant. In Anais do XXI Simpósio em Sistemas Computacionais de Alto Desempenho, pages 203–214, Porto Alegre, RS, Brasil. SBC.

Jurafsky, D. and Martin, J. H. (2009). Speech and Language Processing (2nd Edition). Prentice-Hall, Inc., Upper Saddle River, NJ, USA.

Souza, F., Nogueira, R., and Lotufo, R. (2020). BERTimbau: pretrained BERT models for Brazilian Portuguese. In 9th Brazilian Conference on Intelligent Systems, BRACIS, Rio Grande do Sul, Brazil, October 20-23 (to appear).

Wolf, T., Debut, L., Sanh, V., Chaumond, J., Delangue, C., Moi, A., Cistac, P., Rault, T., Louf, R., Funtowicz, M., Davison, J., Shleifer, S., von Platen, P., Ma, C., Jernite, Y., Plu, J., Xu, C., Scao, T. L., Gugger, S., Drame, M., Lhoest, Q., and Rush, A. M. (2020). Transformers: State-of-the-art natural language processing. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, pages 38–45, Online. Association for Computational Linguistics.
Published: 2021-05-06
SANTOS, Leonardo Costa; DE CARVALHO JUNIOR, Antonio Deusany; GOLDMAN, Alfredo. Comparison of BERT and Snips models for natural language understanding for the A.D.A. virtual assistant. In: REGIONAL SCHOOL OF HIGH PERFORMANCE COMPUTING FROM SÃO PAULO (ERAD-SP), 12., 2021, Evento Online. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2021. p. 33-36. DOI: https://doi.org/10.5753/eradsp.2021.16699.
