Probing LLMs for Joint Encoding of Linguistic Categories

Open Access
Authors
  • G. Starace
  • K. Papakostas
  • R. Choenni
  • A. Panagiotopoulos
Publication date 2023
Host editors
  • H. Bouamor
  • J. Pino
  • K. Bali
Book title The 2023 Conference on Empirical Methods in Natural Language Processing: Findings of the Association for Computational Linguistics: EMNLP 2023
Book subtitle December 6-10, 2023
ISBN (electronic)
  • 9798891760615
Event 2023 Conference on Empirical Methods in Natural Language Processing
Pages (from-to) 7158-7179
Number of pages 22
Publisher Stroudsburg, PA: Association for Computational Linguistics
Organisations
  • Interfacultary Research - Institute for Logic, Language and Computation (ILLC)
Abstract

Large Language Models (LLMs) exhibit impressive performance on a range of NLP tasks, due to the general-purpose linguistic knowledge acquired during pretraining. Existing model interpretability research (Tenney et al., 2019) suggests that a linguistic hierarchy emerges in the LLM layers, with lower layers better suited to solving syntactic tasks and higher layers employed for semantic processing. Yet, little is known about how encodings of different linguistic phenomena interact within the models and to what extent processing of linguistically related categories relies on the same, shared model representations. In this paper, we propose a framework for testing the joint encoding of linguistic categories in LLMs. Focusing on syntax, we find evidence of joint encoding both at the same level of the linguistic hierarchy (related part-of-speech (POS) classes) and across levels (POS classes and related syntactic dependency relations). Our cross-lingual experiments show that the same patterns hold across languages in multilingual LLMs.
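The paper's exact framework is not reproduced here, but probing studies of this kind typically train a lightweight classifier on frozen layer representations and compare performance across layers and categories. The sketch below illustrates that general setup for a layer-wise POS probe; the model name (bert-base-cased), the layer index, and the toy data are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch of a layer-wise linear probe on frozen LLM representations.
# This is a generic illustration of the probing paradigm, NOT the paper's
# actual framework; model, layer, and data below are hypothetical choices.
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import LogisticRegression

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModel.from_pretrained("bert-base-cased", output_hidden_states=True)
model.eval()

# Toy word-level POS data (hypothetical, for illustration only).
sentences = [
    ("The cat sleeps", ["DET", "NOUN", "VERB"]),
    ("A dog barks",    ["DET", "NOUN", "VERB"]),
]

layer = 6  # probe one intermediate layer; vary this to compare layers
feats, labels = [], []
for text, tags in sentences:
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        # hidden_states[layer] has shape (batch, seq_len, hidden_dim)
        hidden = model(**enc).hidden_states[layer][0]
    # Represent each word by its first subword token.
    seen = set()
    for i, wid in enumerate(enc.word_ids()):
        if wid is not None and wid not in seen:
            seen.add(wid)
            feats.append(hidden[i].numpy())
            labels.append(tags[wid])

# The probe itself: a simple linear classifier over frozen features.
probe = LogisticRegression(max_iter=1000).fit(feats, labels)
print("train accuracy:", probe.score(feats, labels))
```

In practice, probe accuracy is compared across layers and across related categories (e.g., POS classes versus dependency relations) on held-out data; testing whether such categories share representations is the kind of question the paper's framework addresses.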

Document type Conference contribution
Language English
Published at https://doi.org/10.18653/v1/2023.findings-emnlp.476
Other links https://www.scopus.com/pages/publications/85183299679