
Filters: Keyword is Chinese coreference resolution
2021-06-01
Ming, Kun. 2020. "Chinese Coreference Resolution via Bidirectional LSTMs using Word and Token Level Representations." 2020 16th International Conference on Computational Intelligence and Security (CIS), pp. 73–76.
Coreference resolution is an important task in natural language processing. Most existing methods rely on word-level representations alone, discarding much of the information contained in the text. To address this issue, we investigate how span-level semantic representations can improve Chinese coreference resolution. Specifically, we propose a model that obtains word and character representations from pre-trained Skip-Gram embeddings and pre-trained BERT, and then explicitly exploits span-level information by running bidirectional LSTMs over these representations. Experiments on the CoNLL-2012 shared task show that the proposed model achieves an F1-score of 62.95%, outperforming our baseline methods.
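
The abstract describes the architecture only at a high level; the sketch below illustrates the general idea of combining word-level and character-level vectors, encoding them with a bidirectional LSTM, and building span representations from the encoder outputs. It is a minimal assumption-laden sketch, not the authors' implementation: randomly initialized embeddings stand in for the pre-trained Skip-Gram and BERT vectors, and the dimensions and boundary-concatenation span representation are illustrative choices.

```python
# Minimal PyTorch sketch: word + character vectors -> BiLSTM -> span representations.
# Embedding tables below are random stand-ins for the pre-trained Skip-Gram and
# BERT representations used in the paper; sizes are illustrative assumptions.
import torch
import torch.nn as nn

class SpanEncoder(nn.Module):
    def __init__(self, vocab_size=5000, word_dim=100, char_dim=768, hidden=200):
        super().__init__()
        # Stand-ins for pre-trained Skip-Gram word vectors and BERT character vectors.
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        self.char_emb = nn.Embedding(vocab_size, char_dim)
        # Bidirectional LSTM over the concatenated word + character representations.
        self.bilstm = nn.LSTM(word_dim + char_dim, hidden,
                              batch_first=True, bidirectional=True)

    def forward(self, word_ids, char_ids, spans):
        # word_ids, char_ids: (batch, seq_len); spans: list of (start, end) indices.
        x = torch.cat([self.word_emb(word_ids), self.char_emb(char_ids)], dim=-1)
        out, _ = self.bilstm(x)  # (batch, seq_len, 2 * hidden)
        # One simple span representation: concatenate the BiLSTM states at the
        # span boundaries (the paper may combine the states differently).
        reps = [torch.cat([out[:, s, :], out[:, e, :]], dim=-1) for s, e in spans]
        return torch.stack(reps, dim=1)  # (batch, n_spans, 4 * hidden)

if __name__ == "__main__":
    enc = SpanEncoder()
    words = torch.randint(0, 5000, (1, 12))
    chars = torch.randint(0, 5000, (1, 12))
    spans = [(0, 2), (5, 8)]
    print(enc(words, chars, spans).shape)  # torch.Size([1, 2, 800])
```

In a full coreference system, the resulting span representations would then be scored in pairs to decide which mentions corefer; that scoring stage is not shown here.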