A Linguistically Motivated Analysis of Intonational Phrasing in Text-to-Speech Systems: Revealing Gaps in Syntactic Sensitivity

Open Access
Publication date 2025
Editors
  • Gemma Boleda
  • Michael Roth
Book title The 29th Conference on Computational Natural Language Learning (CoNLL 2025): Proceedings of the Conference
Book subtitle CoNLL 2025: July 31-August 1, 2025
ISBN (electronic)
  • 9798891762718
Event 29th Conference on Computational Natural Language Learning
Pages (from-to) 126-140
Publisher Kerrville, TX: Association for Computational Linguistics
Organisations
  • Interfacultary Research - Institute for Logic, Language and Computation (ILLC)
Abstract
We analyze the syntactic sensitivity of Text-to-Speech (TTS) systems using methods inspired by psycholinguistic research. Specifically, we focus on the generation of intonational phrase boundaries, which can often be predicted by identifying syntactic boundaries within a sentence. We find that TTS systems struggle to accurately generate intonational phrase boundaries in sentences where syntactic boundaries are ambiguous (e.g., garden path sentences or sentences with attachment ambiguity). In such cases, systems rely on superficial cues such as commas to place boundaries at the correct positions. In contrast, for sentences with simpler syntactic structures, we find that systems do incorporate syntactic cues beyond surface markers. Finally, we fine-tune models on sentences without commas at the syntactic boundary positions, encouraging them to focus on more subtle linguistic cues. Our findings indicate that this leads to more distinct intonation patterns that better reflect the underlying structure.
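The manipulation described in the abstract — presenting the same sentence with and without a comma at the syntactic boundary to see whether the system still places an intonational phrase boundary there — can be sketched as follows. This is an illustrative sketch only, not the authors' actual code; the function name and data layout are assumptions.

```python
# Illustrative sketch (hypothetical, not the paper's code): build a
# minimal-pair stimulus for probing whether a TTS system places an
# intonational phrase boundary at a syntactic boundary once the comma,
# the superficial surface cue, is removed.

def make_stimulus_pair(sentence: str) -> dict:
    """Given a sentence with exactly one comma marking the intended
    syntactic boundary, return a with-comma and a without-comma variant
    plus the word index after which the boundary should fall."""
    if sentence.count(",") != 1:
        raise ValueError("expected exactly one comma marking the boundary")
    before, after = sentence.split(",")
    boundary_index = len(before.split())      # number of words before the boundary
    return {
        "with_comma": sentence,               # surface cue present
        "without_comma": (before + after).strip(),  # same words, cue removed
        "boundary_after_word": boundary_index,
    }

# A garden-path example: without the comma, "the deer" is initially
# parsed as the object of "hunted".
pair = make_stimulus_pair("While the man hunted, the deer ran into the woods")
```

Both variants would then be synthesized, and the acoustics around the annotated boundary position (e.g., pause duration, pitch reset) compared across the pair.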
Document type Conference contribution
Language English
DOI https://doi.org/10.18653/v1/2025.conll-1.9