
Coreference Resolution Using Recurrent Neural Networks

Advisor: 陳信希 (Hsin-Hsi Chen)

Abstract


Coreference resolution is a classic unsolved problem in natural language processing. We propose a novel antecedent ranking model based on hierarchical recurrent neural networks: a first recurrent network builds mention representations from the document's context, and a second recurrent network is then trained to exploit these learned representations, together with an attention mechanism, to detect anaphors and the antecedents they refer to. Our system achieves the highest score to date on the CoNLL 2012 shared task.

Abstract (English)


Coreference resolution is a classic unsolved problem in natural language processing. We present a novel antecedent ranking model based on hierarchical recurrent neural networks (RNNs). A word-level RNN encodes the surrounding context into a representation for each mention. A mention-level network is then trained to exploit these representations, together with a few hand-crafted features, to detect anaphors and their antecedents via a simple attention mechanism. We evaluate our system on the CoNLL 2012 shared task and establish a new state of the art.
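The two-level architecture described in the abstract can be sketched very roughly as follows. This is a minimal illustrative mockup, not the thesis's actual model: the weights are random and untrained, the mention representation (span averaging), dimensions, and dot-product attention scoring are all assumptions chosen for brevity, and the hand-crafted features and training objective are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
D_W, D_H = 8, 16  # word-embedding and hidden sizes (illustrative choices)

# Word-level recurrent encoder (a plain Elman RNN stands in for the
# thesis's word-level RNN; weights here are random, i.e. untrained).
W_xh = rng.normal(scale=0.1, size=(D_W, D_H))
W_hh = rng.normal(scale=0.1, size=(D_H, D_H))

def encode(word_embs):
    """Run the word-level RNN over a document, one hidden state per word."""
    h = np.zeros(D_H)
    states = []
    for x in word_embs:
        h = np.tanh(x @ W_xh + h @ W_hh)
        states.append(h)
    return np.stack(states)

def mention_repr(states, span):
    """Represent a mention by averaging hidden states over its span
    (an assumed, simple pooling scheme)."""
    start, end = span
    return states[start:end + 1].mean(axis=0)

def rank_antecedents(states, anaphor_span, candidate_spans):
    """Mention-level step: score each candidate antecedent against the
    anaphor with dot-product attention, then normalize with softmax."""
    anaphor = mention_repr(states, anaphor_span)
    scores = np.array([mention_repr(states, c) @ anaphor
                       for c in candidate_spans])
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()
    return int(np.argmax(probs)), probs

# Toy "document" of 10 random word embeddings; the anaphor is the last
# two words, with two earlier spans as candidate antecedents.
doc = rng.normal(size=(10, D_W))
states = encode(doc)
best, probs = rank_antecedents(states, (8, 9), [(0, 1), (3, 4)])
```

With trained parameters, `probs` would be an attention distribution over candidate antecedents and `best` the predicted antecedent; here it merely demonstrates the data flow from word-level encoding to mention-level ranking.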

