Chinese Coreference Resolution via Bidirectional LSTMs using Word and Token Level Representations

Title: Chinese Coreference Resolution via Bidirectional LSTMs using Word and Token Level Representations
Publication Type: Conference Paper
Year of Publication: 2020
Authors: Ming, Kun
Conference Name: 2020 16th International Conference on Computational Intelligence and Security (CIS)
Date Published: November 2020
Keywords: BERT, bidirectional LSTM, Bit error rate, Chinese coreference resolution, composability, compositionality, Computational Intelligence, Computational modeling, cryptography, natural language processing, pubcrawl, security, Semantics, Task Analysis
Abstract: Coreference resolution is an important task in the field of natural language processing. Most existing methods rely on word-level representations alone, discarding much of the information in the text. To address this issue, we investigate how to improve Chinese coreference resolution by using span-level semantic representations. Specifically, we propose a model that acquires word and character representations from pre-trained Skip-Gram embeddings and pre-trained BERT, then explicitly leverages span-level information by running bidirectional LSTMs over these representations. Experiments on the CoNLL-2012 shared task demonstrate that the proposed model achieves an F1-score of 62.95%, outperforming our baseline methods.
DOI: 10.1109/CIS52066.2020.00024
Citation Key: ming_chinese_2020
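The abstract describes a pipeline of concatenated word- and character-level embeddings fed through a bidirectional LSTM, from which span-level representations are derived. The following is a minimal NumPy sketch of that general idea, not the paper's actual implementation: all dimensions are arbitrary, the random matrices stand in for pre-trained Skip-Gram and BERT outputs, and the boundary-concatenation span representation is one common choice, not necessarily the one used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def lstm_params(input_dim, hidden_dim):
    # Weights and bias for the four LSTM gates (input, forget, cell, output), stacked.
    W = rng.standard_normal((4 * hidden_dim, input_dim + hidden_dim)) * 0.1
    b = np.zeros(4 * hidden_dim)
    return W, b

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_run(xs, W, b, hidden_dim):
    # Run a single-direction LSTM over the sequence, returning all hidden states.
    h = np.zeros(hidden_dim)
    c = np.zeros(hidden_dim)
    outs = []
    for x in xs:
        z = W @ np.concatenate([x, h]) + b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)
        h = o * np.tanh(c)
        outs.append(h)
    return np.stack(outs)

def bilstm(xs, fw, bw, hidden_dim):
    # Forward pass plus a backward pass over the reversed sequence, concatenated per token.
    h_fw = lstm_run(xs, *fw, hidden_dim)
    h_bw = lstm_run(xs[::-1], *bw, hidden_dim)[::-1]
    return np.concatenate([h_fw, h_bw], axis=1)

# Hypothetical dimensions for illustration only.
word_dim, char_dim, hidden, seq_len = 8, 6, 5, 4
word_emb = rng.standard_normal((seq_len, word_dim))  # stand-in for Skip-Gram embeddings
char_emb = rng.standard_normal((seq_len, char_dim))  # stand-in for BERT character representations
tokens = np.concatenate([word_emb, char_emb], axis=1)

fw = lstm_params(word_dim + char_dim, hidden)
bw = lstm_params(word_dim + char_dim, hidden)
H = bilstm(tokens, fw, bw, hidden)            # shape: (seq_len, 2 * hidden)

# One possible span representation for tokens [1, 3): concatenate the boundary states.
span_repr = np.concatenate([H[1], H[2]])      # shape: (4 * hidden,)
```

Candidate mention spans scored this way can then be fed to a coreference scorer; the sketch stops at the span representation, which is the part the abstract makes explicit.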