Enhancing PLM Performance on Labour Market Tasks via Instruction-based Finetuning and Prompt-tuning with Rules

Open Access
Authors
Publication date 2023
Host editors
  • M. Kaya
  • T. Bogers
  • D. Graus
  • C. Johnson
  • J.-J. Decorte
Book title Proceedings of the 3rd Workshop on Recommender Systems for Human Resources (RecSys in HR 2023)
Book subtitle co-located with the 17th ACM Conference on Recommender Systems (RecSys 2023): Singapore, Singapore, 18-22 September 2023
Series CEUR Workshop Proceedings
Event 3rd Workshop on Recommender Systems for Human Resources, RecSys in HR 2023
Article number 4
Number of pages 10
Publisher Aachen: CEUR-WS
Organisations
  • Faculty of Economics and Business (FEB) - Amsterdam Business School Research Institute (ABS-RI)
Abstract

The increasing digitization of the labour market has given researchers, educators, and companies the means to analyze and better understand it. However, labour market resources, although available in high volumes, tend to be unstructured, making research into methodologies for identifying, linking, and extracting entities increasingly important. Against the backdrop of this quest for better labour market representations, resource constraints and the unavailability of large-scale annotated data force a reliance on human domain experts. We demonstrate the effectiveness of prompt-based tuning of pre-trained language models (PLMs) in labour-market-specific applications. Our results indicate that cost-efficient methods such as Prompt Tuning with Rules (PTR) and instruction tuning without exemplars can significantly increase the performance of PLMs on downstream labour market applications without introducing additional model layers, manual annotations, or data augmentation.
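To illustrate the kind of prompt-based tuning the abstract refers to, the sketch below shows the general PTR idea of composing a cloze-style prompt with a [MASK] slot and mapping mask fills back to task labels via a verbalizer. All templates, label words, and entity examples here are illustrative assumptions, not taken from the paper, and the PLM call is omitted.

```python
# Hypothetical PTR-style sketch: a sub-prompt with a [MASK] slot is
# appended to the input, and a verbalizer maps the PLM's predicted
# mask word back to a task label. Templates and label words below are
# illustrative only; a real setup would query a masked-language model.

def build_prompt(sentence: str, entity: str) -> str:
    # Compose the input with a cloze-style sub-prompt for entity typing.
    return f"{sentence} In this job posting, {entity} is a [MASK]."

# Verbalizer: maps label words (mask fills) to task labels (illustrative).
LABEL_WORDS = {
    "skill": "SKILL",
    "occupation": "OCCUPATION",
}

def map_prediction(mask_fill: str) -> str:
    # Map the PLM's most likely mask word to a label; fall back to OTHER.
    return LABEL_WORDS.get(mask_fill, "OTHER")

prompt = build_prompt("Seeking a nurse with triage experience.", "nurse")
label = map_prediction("occupation")  # would come from the PLM's mask fill
```

In this setting no new classification head is trained; the pre-trained masked-language-modelling head does the prediction, which is why such methods avoid the additional model layers mentioned in the abstract.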

Document type Conference contribution
Language English
Published at https://ceur-ws.org/Vol-3490/RecSysHR2023-paper_4.pdf
Other links https://ceur-ws.org/Vol-3490
Downloads
RecSysHR2023-paper_4-1 (Final published version)