Pre-Wiring and Pre-Training: What does a neural network need to learn truly general identity rules?

Open Access
Authors
Publication date 2016
Host editors
  • T.R. Besold
  • A. Bordes
  • A. d'Avila Garcez
  • G. Wayne
Book title Proceedings of the Workshop on Cognitive Computation: Integrating neural and symbolic approaches 2016
Book subtitle co-located with the 30th Annual Conference on Neural Information Processing Systems (NIPS 2016): Barcelona, Spain, December 9, 2016
Series CEUR Workshop Proceedings
Event Workshop on Cognitive Computation: Integrating neural and symbolic approaches 2016
Article number 4
Number of pages 9
Publisher Aachen: CEUR-WS
Organisations
  • Interfacultary Research - Institute for Logic, Language and Computation (ILLC)
Abstract
In an influential paper, Marcus et al. [1999] claimed that connectionist models cannot account for human success at learning tasks involving the generalization of abstract knowledge, such as grammatical rules. This claim triggered a heated debate, centered mostly around variants of the Simple Recurrent Network model [Elman, 1990]. In our work, we revisit this unresolved debate and analyze the underlying issues from a different perspective. We argue that, in order to simulate human-like learning of grammatical rules, a neural network model should not be used as a tabula rasa; rather, the initial wiring of the neural connections and the experience acquired prior to the actual task should be incorporated into the model. We present two methods that aim to provide such an initial state: a manipulation of the initial connections of the network in a cognitively plausible manner (concretely, by implementing a “delay-line” memory), and a pre-training algorithm that incrementally challenges the network with novel stimuli. We implement these techniques in an Echo State Network [Jaeger, 2001], and we show that only when both techniques are combined is the ESN able to learn truly general identity rules.
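The “delay-line” pre-wiring described in the abstract can be illustrated with a minimal sketch: a reservoir whose recurrent matrix is a sub-diagonal shift register, so past inputs are pushed along the line and retained rather than mixed at random. This is only an illustration of the idea, not the authors' actual model; the sizes, the 0.9 decay factor, and the random input scaling are assumptions made here for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 3, 20  # hypothetical input and reservoir sizes

# Random input weights, as in a standard Echo State Network.
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))

# "Delay-line" reservoir: a sub-diagonal shift register. At each step the
# state is pushed one unit along the line, so earlier inputs echo through
# the reservoir largely intact instead of being scrambled by random
# recurrent weights.
W_res = 0.9 * np.eye(n_res, k=-1)  # decay < 1 keeps the echoes fading

def step(x_prev, u):
    """One reservoir update: x_t = tanh(W_res @ x_{t-1} + W_in @ u_t)."""
    return np.tanh(W_res @ x_prev + W_in @ u)

x = np.zeros(n_res)
for u in np.eye(n_in):  # feed three one-hot "syllables" in sequence
    x = step(x, u)
```

In a full ESN only a linear readout on the reservoir state would be trained; the point of the sketch is that the memory structure is wired in before any learning takes place, in line with the paper's argument against treating the network as a tabula rasa.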
Document type Conference contribution
Language English
Published at http://ceur-ws.org/Vol-1773/CoCoNIPS_2016_paper4.pdf
Other links http://ceur-ws.org/Vol-1773/