Learning vector representations for sentences: The recursive deep learning approach
| Authors | |
|---|---|
| Supervisors | |
| Cosupervisors | |
| Award date | 03-06-2016 |
| ISBN | |
| Number of pages | 141 |
| Organisations | |
| Abstract | According to the principle of compositionality, the meaning of a sentence is computed from the meanings of its parts and the way they are syntactically combined. Unfortunately, unlike formal semantics, distributional semantics lacks elegant compositional mechanisms such as the function application of the lambda calculus. Finding vectorial composition functions therefore remains an open challenge. Thanks to the rapid rise of deep learning, the idea of using neural networks as composition functions has resurfaced, resulting in the recursive neural network (RNN) model and a family of extensions. With (near) state-of-the-art performance on a wide range of tasks such as syntactic parsing and sentiment analysis, this approach has become a promising solution to the challenge. Following this trend, this dissertation aims at extending the RNN model. In one direction, conventional neural networks are replaced by more expressive ones, such as long short-term memory (LSTM) units adapted to tree structures. This improves performance on sentence classification tasks such as sentiment analysis. In another direction, information is allowed to flow through a parse tree not only bottom-up but also top-down, so that both the content and the context of a constituent can be recursively encoded in vectors. This extends the range of tasks that can be performed, including top-down parsing and semantic role labeling. |
| Document type | PhD thesis |
| Note | Research conducted at Universiteit van Amsterdam. Series: ILLC Dissertation Series DS-2016-05 |
| Language | English |
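The core idea sketched in the abstract, a recursive neural network that builds a sentence vector by composing child vectors bottom-up along a binary parse tree, can be illustrated with a minimal sketch. This is not the dissertation's actual model; the composition function `p = tanh(W[c1; c2] + b)`, the dimensions, and the toy "embeddings" are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical vector dimension for words and phrases.
d = 4
W = rng.standard_normal((d, 2 * d)) * 0.1  # composition weight matrix (assumed shape)
b = np.zeros(d)                            # composition bias

def compose(left, right):
    """One recursive composition step: parent = tanh(W [left; right] + b)."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

# Toy binarized parse of "very good movie": ((very good) movie),
# with random stand-ins for word embeddings.
very, good, movie = (rng.standard_normal(d) for _ in range(3))
phrase = compose(very, good)       # vector for the constituent "very good"
sentence = compose(phrase, movie)  # vector for the whole sentence
```

Because the same `W` and `b` are reused at every tree node, the function applies recursively to trees of any shape, which is what lets a single network encode arbitrary sentences.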