Compositionality for recursive neural networks
| Authors | |
|---|---|
| Publication date | 06-2019 |
| Journal | Journal of Applied Logics - IfCoLog Journal of Logics and their Applications |
| Event | 13th International Workshop on Neural-Symbolic Learning and Reasoning |
| Volume | 6 |
| Issue number | 4 |
| Pages (from-to) | 709-724 |
| Number of pages | 16 |
| Organisations | |
| Abstract | Modelling compositionality has been a longstanding area of research in the field of vector space semantics. The categorical approach to compositionality maps grammar onto vector spaces in a principled way, but comes under fire for requiring the formation of very high-dimensional matrices and tensors, and therefore being computationally infeasible. In this paper I show how a linear simplification of recursive neural tensor network models can be mapped directly onto the categorical approach, giving a way of computing the required matrices and tensors. This mapping suggests a number of lines of research for both categorical compositional vector space models of meaning and for recursive neural network models of compositionality. |
| Document type | Article |
| Note | In special issue: Neural-Symbolic Learning and Reasoning (NeSy'18) |
| Language | English |
| Published at | https://www.collegepublications.co.uk/ifcolog/?00033 |
| Other links | https://www.scopus.com/pages/publications/85071265419 |
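To make the composition step the abstract refers to concrete, the sketch below shows a recursive neural tensor network composition and the linear simplification of it in NumPy. This is an illustrative reading rather than the paper's construction: the dimension `d`, the parameter shapes, the random initialisation, and the helper name `rntn_compose` are all assumptions made for the example.

```python
import numpy as np

d = 4                            # toy embedding dimension (illustrative assumption)
rng = np.random.default_rng(0)

# Child word vectors, e.g. an adjective and a noun.
a = rng.normal(size=d)
b = rng.normal(size=d)

# RNTN parameters: a third-order tensor V and a matrix W (randomly initialised here).
V = rng.normal(size=(d, d, d))   # V[k] is the d x d slice for output unit k
W = rng.normal(size=(d, 2 * d))

def rntn_compose(a, b, nonlinear=True):
    """Compose two child vectors into a parent vector.

    With nonlinear=True this is the usual recursive neural tensor network
    step p = tanh(a^T V b + W [a; b]); with nonlinear=False it is the
    linear simplification the abstract refers to, a purely multilinear map.
    """
    bilinear = np.einsum('i,kij,j->k', a, V, b)     # a^T V[k] b for each output unit k
    linear = W @ np.concatenate([a, b])
    p = bilinear + linear
    return np.tanh(p) if nonlinear else p

parent_nonlinear = rntn_compose(a, b, nonlinear=True)
parent_linear = rntn_compose(a, b, nonlinear=False)
print(parent_nonlinear)
print(parent_linear)
```

Dropping the `tanh` is what makes the composition a multilinear map in its arguments, which is the property that lets such a composition be read as a matrix or tensor applied to its arguments in the categorical compositional setting.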
