Optimizing differentiable relaxations of coreference evaluation metrics
| Authors | Phong Le, Ivan Titov |
|---|---|
| Publication date | 2017 |
| Host editors | |
| Book title | The 21st Conference on Computational Natural Language Learning |
| Book subtitle | Proceedings of the Conference : CoNLL 2017 : August 2-August 4, 2017, Vancouver, Canada |
| ISBN (electronic) | |
| Event | 21st Conference on Computational Natural Language Learning, CoNLL 2017 |
| Pages (from-to) | 390-399 |
| Number of pages | 10 |
| Publisher | Stroudsburg, PA: Association for Computational Linguistics |
| Organisations | |
| Abstract | Coreference evaluation metrics are hard to optimize directly as they are non-differentiable functions, not easily decomposable into elementary decisions. Consequently, most approaches optimize objectives only indirectly related to the end goal, resulting in suboptimal performance. Instead, we propose a differentiable relaxation that lends itself to gradient-based optimisation, thus bypassing the need for reinforcement learning or heuristic modification of cross-entropy. We show that by modifying the training objective of a competitive neural coreference system, we obtain a substantial gain in performance. This suggests that our approach can be regarded as a viable alternative to using reinforcement learning or more computationally expensive imitation learning. |
| Document type | Conference contribution |
| Language | English |
| Published at | https://doi.org/10.18653/v1/k17-1039 |
| Other links | https://github.com/lephong/diffmetric_coref https://www.scopus.com/pages/publications/85072985881 |
| Downloads | K17-1039 (Final published version) |
| Permalink to this page | |
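
The abstract describes replacing hard, non-differentiable coreference decisions with a soft relaxation that can be trained by ordinary gradient descent. The snippet below is a minimal illustrative sketch of that general idea, not the paper's actual relaxation of the B³ or LEA metrics (see the GitHub repository linked above for the authors' implementation): the random antecedent `scores`, the toy `gold` link matrix, and the soft link-level F1 objective are all placeholder assumptions.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
n = 4  # toy number of mentions

# Pairwise antecedent scores s[i][j]: mention i linking to candidate j (j < i);
# the diagonal s[i][i] stands for "start a new cluster". In a real system
# these scores come from the neural coreference model being trained.
scores = torch.randn(n, n, requires_grad=True)

# Mention i may only pick j <= i, since antecedents must precede the mention.
valid = torch.tril(torch.ones(n, n)).bool()
probs = F.softmax(scores.masked_fill(~valid, float("-inf")), dim=1)

# Hypothetical gold links: gold[i][j] = 1 iff j is a correct antecedent of i.
gold = torch.tensor([[0., 0., 0., 0.],
                     [1., 0., 0., 0.],
                     [0., 0., 0., 0.],
                     [0., 0., 1., 0.]])

# Soft counts: expected numbers of predicted and of correct links under probs.
link_mask = valid & ~torch.eye(n).bool()  # off-diagonal entries = actual links
predicted = probs[link_mask].sum()
correct = (probs * gold).sum()

# Soft link-level precision/recall/F1: a differentiable stand-in for a hard
# evaluation metric (the paper relaxes cluster-level metrics analogously).
precision = correct / (predicted + 1e-8)
recall = correct / (gold.sum() + 1e-8)
f1 = 2 * precision * recall / (precision + recall + 1e-8)

loss = 1.0 - f1   # maximizing the soft metric = minimizing this loss
loss.backward()   # gradients flow through the relaxation to the scores
print(f"soft F1 = {f1.item():.3f}, grad norm = {scores.grad.norm().item():.3f}")
```

The key move is the softmax in place of an argmax: the soft counts make the metric-style objective differentiable end to end, so plain backpropagation can replace the reinforcement-learning or imitation-learning machinery the abstract mentions.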