Title | Chinese Coreference Resolution via Bidirectional LSTMs using Word and Token Level Representations |
Publication Type | Conference Paper |
Year of Publication | 2020 |
Authors | Ming, Kun |
Conference Name | 2020 16th International Conference on Computational Intelligence and Security (CIS) |
Date Published | November
Keywords | BERT, bidirectional LSTM, Chinese coreference resolution, Computational modeling, natural language processing, Semantics, Task Analysis
Abstract | Coreference resolution is an important task in the field of natural language processing. Most existing methods rely on word-level representations alone, discarding much of the information in the text. To address this issue, we investigate how to improve Chinese coreference resolution by using span-level semantic representations. Specifically, we propose a model that acquires word and character representations from pre-trained Skip-Gram embeddings and pre-trained BERT, then explicitly leverages span-level information by running bidirectional LSTMs over these representations. Experiments on the CoNLL-2012 shared task demonstrate that the proposed model achieves an F1-score of 62.95%, outperforming our baseline methods.
DOI | 10.1109/CIS52066.2020.00024 |
Citation Key | ming_chinese_2020 |
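
The architecture summarized in the abstract (word-level and character-level vectors combined, passed through a bidirectional LSTM, with span representations built on top) can be sketched as follows. This is a minimal illustrative NumPy sketch, not the paper's implementation: the weights are randomly initialized stand-ins for trained parameters, the input vectors are toy placeholders for Skip-Gram and BERT features, and the span representation here is simply the concatenation of the BiLSTM states at the span's boundary tokens, which is one common assumption for span-based coreference models.

```python
import numpy as np

# Hypothetical sketch (not the paper's code): combine word-level and
# character-level vectors, run a bidirectional LSTM over the sequence,
# and build span-level representations from the boundary states.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_pass(inputs, W, U, b, hidden):
    # Single-layer LSTM over a sequence; gate pre-activations are
    # stacked as [input, forget, output, candidate].
    h = np.zeros(hidden)
    c = np.zeros(hidden)
    states = []
    for x in inputs:
        z = W @ x + U @ h + b
        i, f, o = (sigmoid(z[k * hidden:(k + 1) * hidden]) for k in range(3))
        g = np.tanh(z[3 * hidden:])
        c = f * c + i * g
        h = o * np.tanh(c)
        states.append(h)
    return states

def bilstm_span_repr(tokens, hidden=8):
    dim = tokens.shape[1]
    # Randomly initialized parameters stand in for trained weights.
    Wf = rng.normal(size=(4 * hidden, dim))
    Uf = rng.normal(size=(4 * hidden, hidden))
    Wb = rng.normal(size=(4 * hidden, dim))
    Ub = rng.normal(size=(4 * hidden, hidden))
    bf = np.zeros(4 * hidden)
    bb = np.zeros(4 * hidden)
    fwd = lstm_pass(tokens, Wf, Uf, bf, hidden)
    bwd = lstm_pass(tokens[::-1], Wb, Ub, bb, hidden)[::-1]
    # Each token state concatenates the forward and backward directions.
    states = [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]

    def span(i, j):
        # Span representation: boundary states of the span [i, j].
        return np.concatenate([states[i], states[j]])

    return span

# Toy inputs standing in for word-level (e.g. Skip-Gram) and
# character-level (e.g. BERT-derived) features for 5 tokens.
word_vecs = rng.normal(size=(5, 6))
char_vecs = rng.normal(size=(5, 4))
tokens = np.concatenate([word_vecs, char_vecs], axis=1)

span = bilstm_span_repr(tokens, hidden=8)
print(span(1, 3).shape)  # (32,): 2 boundary tokens x (2 directions x 8 hidden)
```

In a trained model the span scores from such representations would feed a mention-ranking or antecedent-scoring layer; the sketch stops at the representation itself, which is the component the abstract highlights.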