Impact of Task Adapting on Transformer Models for Targeted Sentiment Analysis in Croatian Headlines

Open Access
Authors
Publication date 2024
Host editors
  • N. Calzolari
  • M.-Y. Kan
  • V. Hoste
  • A. Lenci
  • S. Sakti
  • N. Xue
Book title The 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Book subtitle Main conference proceedings: 20-25 May 2024, Torino, Italy
ISBN (electronic)
  • 9782493814104
Series COLING
Event 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Pages (from-to) 8662–8674
Publisher ELRA Language Resources Association
Organisations
  • Interfacultary Research - Institute for Logic, Language and Computation (ILLC)
Abstract
Transformer models such as BERT are often taken off the shelf and then fine-tuned on a downstream task. Although this is sufficient for many tasks, low-resource settings require special attention. We demonstrate an approach that adds an extra stage of self-supervised task-adaptive pre-training to a number of Croatian-supporting Transformer models. In particular, we focus on approaches to language, domain, and task adaptation. The task in question is targeted sentiment analysis for Croatian news headlines. We produce new state-of-the-art results (F1 = 0.781), but the highest-performing model still struggles with irony and implicature. Overall, we find that task-adaptive pre-training benefits massively multilingual models but not Croatian-dominant models.
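The abstract describes an extra self-supervised task-adaptive pre-training (TAPT) stage applied before supervised fine-tuning. The sketch below shows how such a two-stage pipeline is commonly set up with Hugging Face Transformers; the checkpoint name (classla/bcms-bertic, a Croatian-supporting model), file paths, and hyperparameters are illustrative assumptions, not the authors' published configuration.

```python
# Minimal TAPT-then-fine-tune sketch; all names and settings below are
# illustrative assumptions, not the paper's actual training setup.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoModelForSequenceClassification,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "classla/bcms-bertic"  # any BERT-like Croatian-supporting checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Stage 1: task-adaptive pre-training -- continue masked-language-model
# training on unlabelled task-domain text (here, news headlines).
headlines = load_dataset("text", data_files={"train": "headlines.txt"})["train"]
headlines = headlines.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=64),
    batched=True,
    remove_columns=["text"],
)

mlm_model = AutoModelForMaskedLM.from_pretrained(model_name)
mlm_trainer = Trainer(
    model=mlm_model,
    args=TrainingArguments(output_dir="tapt", num_train_epochs=3),
    train_dataset=headlines,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15),
)
mlm_trainer.train()
mlm_trainer.save_model("tapt")

# Stage 2: supervised fine-tuning for targeted sentiment analysis,
# framed here as 3-way classification (negative / neutral / positive).
clf_model = AutoModelForSequenceClassification.from_pretrained("tapt", num_labels=3)
# ...fine-tune clf_model on the labelled headline data as usual.
```

The design point is that both stages reuse the same tokenizer and backbone weights: Stage 1 adapts the language-model head to headline-style text without needing labels, and Stage 2 swaps in a classification head on top of the adapted encoder.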
Document type Conference contribution
Language English
Published at https://aclanthology.org/2024.lrec-main.760