Optimization Strategies for BERT-Based Named Entity Recognition

  • Monique Monteiro (UFPE)
  • Cleber Zanchettin (UFPE)

Abstract

Transfer learning through language modeling has achieved state-of-the-art results on several natural language processing tasks, such as named entity recognition, question answering, and sentiment analysis. Despite these advances, however, some tasks still require more task-specific solutions. This paper explores different approaches to improving the performance of Named Entity Recognition (NER) with transformer-based models pre-trained for language modeling. We investigate model soups and domain adaptation methods for entity recognition in Portuguese, providing valuable insights into the effectiveness of these methods for NER and contributing to the development of more accurate models. We also evaluate NER performance in few- and zero-shot learning settings with a causal language model. In particular, we evaluate diverse BERT-based models trained on different datasets covering both general and specific domains. Our results show significant improvements from model soup techniques and from in-domain pretraining compared to within-task pretraining.
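For readers unfamiliar with model soups, the sketch below illustrates the simplest variant, a "uniform soup", which averages the weights of several fine-tuned checkpoints of the same architecture. The checkpoint paths are hypothetical placeholders, and this is an illustration of the general idea rather than the exact procedure used in the paper.

    # A minimal sketch of a "uniform" model soup: average the weights of
    # several fine-tuned checkpoints that share the same architecture.
    # The checkpoint paths are hypothetical placeholders.
    from transformers import AutoModelForTokenClassification

    checkpoint_dirs = ["ner-run-1", "ner-run-2", "ner-run-3"]

    soup_state = None
    for path in checkpoint_dirs:
        state = AutoModelForTokenClassification.from_pretrained(path).state_dict()
        if soup_state is None:
            # Initialize the running sum with a float copy of the first checkpoint.
            soup_state = {k: v.clone().float() for k, v in state.items()}
        else:
            for k in soup_state:
                soup_state[k] += state[k].float()

    # Average: divide the accumulated sum by the number of checkpoints.
    for k in soup_state:
        soup_state[k] /= len(checkpoint_dirs)

    # Load the averaged weights into a model with the same architecture
    # (load_state_dict copies values into the existing, correctly typed tensors).
    soup_model = AutoModelForTokenClassification.from_pretrained(checkpoint_dirs[0])
    soup_model.load_state_dict(soup_state)

The averaged model is then evaluated as a single checkpoint; no extra inference cost is incurred compared to any individual fine-tuned run.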
Published
25/09/2023
How to Cite

MONTEIRO, Monique; ZANCHETTIN, Cleber. Optimization Strategies for BERT-Based Named Entity Recognition. In: BRAZILIAN CONFERENCE ON INTELLIGENT SYSTEMS (BRACIS), 12., 2023, Belo Horizonte/MG. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2023. p. 80-94. ISSN 2643-6264.