Some correct error-driven versions of the Constraint Demotion algorithm

Open Access
Publication date: 2009
Journal: Linguistic Inquiry
Volume/Issue: 40(4)
Pages: 667–686
Organisations
  • Faculty of Humanities (FGw) - Amsterdam Institute for Humanities Research (AIHR) - Amsterdam Center for Language and Communication (ACLC)
Abstract
This article shows that Error-Driven Constraint Demotion (EDCD), an error-driven learning algorithm proposed by Tesar (1995) for Prince and Smolensky’s (1993/2004) version of Optimality Theory, can fail to converge to a correct totally ranked hierarchy of constraints, unlike the earlier non-error-driven learning algorithms proposed by Tesar and Smolensky (1993). The cause of the problem is found in Tesar’s use of "mark-pooling ties," indicating that EDCD can be repaired by assuming Anttila’s (1997) "permuting ties" instead. Proofs show, and simulations confirm, that totally ranked hierarchies can indeed be found by both this repaired version of EDCD and Boersma’s (1998) Minimal Gradual Learning Algorithm.
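The abstract turns on the Constraint Demotion update that EDCD applies after each learning error: demote every loser-preferring constraint that is not yet dominated by a winner-preferring one to the stratum just below the highest-ranked winner-preferring constraint. A minimal sketch of that single demotion step, under assumed representations (strata as lists of constraint names, violation profiles as dicts; not the article's own formalization), might look like:

```python
def demote(strata, winner_viols, loser_viols):
    """One Constraint Demotion update (Tesar & Smolensky 1993 style sketch).

    strata: list of lists of constraint names, highest-ranked stratum first.
    winner_viols / loser_viols: dict mapping each constraint to the number
    of violation marks it assigns to the observed winner / the learner's loser.
    """
    def rank(c):
        # Index of the stratum containing constraint c (0 = highest).
        return next(i for i, s in enumerate(strata) if c in s)

    # Constraints preferring the winner assign it fewer marks than the loser.
    winner_pref = [c for s in strata for c in s if winner_viols[c] < loser_viols[c]]
    loser_pref = [c for s in strata for c in s if loser_viols[c] < winner_viols[c]]

    if not winner_pref:
        return strata  # no informative error: nothing to demote

    pivot = min(rank(c) for c in winner_pref)  # highest winner-preferring stratum
    new = [list(s) for s in strata]
    target = pivot + 1
    if target == len(new):
        new.append([])  # open a fresh stratum below the hierarchy if needed

    # Demote loser-preferrers ranked at or above the pivot to just below it.
    for c in loser_pref:
        if rank(c) <= pivot:
            new[rank(c)].remove(c)
            new[target].append(c)

    return [s for s in new if s]  # drop any emptied strata
```

For example, with strata `[["A", "B"], ["C"]]` and an error where `A` prefers the loser and `B` the winner, `A` is demoted below `B`, yielding `[["B"], ["C", "A"]]`. The article's point is that how a *stratum* (a tie) is then evaluated during production, by pooling marks versus by permuting the tied constraints, determines whether this error-driven loop converges.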
Document type: Article
Published at: https://doi.org/10.1162/ling.2009.40.4.667
Published at: http://muse.jhu.edu/journals/linguistic_inquiry/v040/40.4.boersma.pdf