A multiresolution model of rhythmic expectancy

Open Access
Authors
  • L. M. Smith
  • H. Honing
Publication date 2008
Host editors
  • K. Miyazaki
  • Y. Hiraga
  • M. Adachi
  • Y. Nakajima
  • M. Tsuzaki
Book title Proceedings of the 10th International Conference on Music Perception and Cognition (ICMPC 10), Sapporo, Japan
ISBN
  • 9784990420802
Event 10th International Conference on Music Perception and Cognition (ICMPC 10), Sapporo, Japan
Pages (from-to) 360-365
Publisher Sapporo: Hokkaido University
Organisations
  • Interfacultary Research - Institute for Logic, Language and Computation (ILLC)
Abstract
We describe a computational model of rhythmic cognition that predicts expected onset times. A dynamic representation of musical rhythm is used: a multiresolution analysis based on the continuous wavelet transform. This representation decomposes the temporal structure of a musical rhythm into time-varying frequency components within the rhythmic frequency range (at a sample rate of 200 Hz). Both expressive timing and temporal structure (score times) contribute, in an integrated fashion, to determining the temporal expectancies. Future expected times are computed from peaks in the accumulation of time-frequency ridges; this accumulation at the edge of the analysed time window forms a dynamic expectancy. We evaluate the model on data sets of expressively timed (performed) and generated musical rhythms, testing its ability to produce expectancy profiles that correspond to metrical profiles. The results show that rhythms in two different meters can be distinguished. Such a representation indicates that a bottom-up, data-oriented (non-cognitive) process can reveal durations that match the metrical structure of realistic musical examples. This helps to clarify the role of schematic (top-down) expectancy and its contribution to the formation of musical expectation.
Document type Conference contribution
Published at http://cf.hum.uva.nl/mmm/papers/smith-honing-2008a.pdf