Title | Research on Relation Extraction of Fusion Entity Enhancement and Shortest Dependency Path based on BERT |
Publication Type | Conference Paper |
Year of Publication | 2022 |
Authors | Sun, Zeyu; Zhang, Chi
Conference Name | 2022 IEEE 10th Joint International Information Technology and Artificial Intelligence Conference (ITAIC) |
Keywords | BERT, composability, compositionality, Cyber Dependencies, Deep Learning, feature extraction, Human Behavior, human factors, Metrics, natural language processing, relation extraction, resilience, Scalability, Semantics, shortest dependency path, Syntactics, Transformers
Abstract | Deep learning models achieve good results on text relation extraction tasks by relying on word features and position features of the text. However, previous studies have failed to make full use of the semantic information contained in sentence dependency syntax trees, and data sparsity and noise propagation still affect classification models. The BERT (Bidirectional Encoder Representations from Transformers) pre-trained language model provides better representations for natural language processing tasks, and entity enhancement methods have proved effective in relation extraction tasks. This paper therefore proposes a model that combines the shortest dependency path with an entity-enhanced BERT pre-trained language model, reducing the impact of noise terms on the classification model and obtaining more semantically expressive feature representations. The algorithm is tested on the SemEval-2010 Task 8 English relation extraction dataset, where the final experiment reaches an F1 value of 0.881.
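The abstract combines two ideas: pruning a sentence to the shortest dependency path (SDP) between the entity pair, and enhancing the entities in the model input. The sketch below is not the authors' implementation; it illustrates the general technique using spaCy's dependency parse with networkx for the SDP, and the common [E1]/[E2] marker scheme as an assumed stand-in for the paper's entity-enhancement method.

```python
# Minimal sketch (assumed illustration, not the paper's code) of:
#  (1) shortest dependency path extraction between two entity words;
#  (2) entity-marker insertion for a BERT-style input sequence.
import networkx as nx
import spacy

nlp = spacy.load("en_core_web_sm")

def shortest_dependency_path(sentence: str, e1: str, e2: str) -> list[str]:
    """Return the tokens on the SDP between entity words e1 and e2."""
    doc = nlp(sentence)
    # Treat the dependency tree as an undirected graph over token indices.
    graph = nx.Graph()
    for token in doc:
        for child in token.children:
            graph.add_edge(token.i, child.i)
    # Locate entity tokens by surface form (simplified: assumes each
    # entity word occurs verbatim in the sentence).
    src = next(t.i for t in doc if t.text == e1)
    dst = next(t.i for t in doc if t.text == e2)
    return [doc[i].text for i in nx.shortest_path(graph, src, dst)]

def mark_entities(sentence: str, e1: str, e2: str) -> str:
    """Wrap entities in marker tokens, a common entity-enhancement scheme."""
    marked = sentence.replace(e1, f"[E1] {e1} [/E1]", 1)
    return marked.replace(e2, f"[E2] {e2} [/E2]", 1)

if __name__ == "__main__":
    sent = "The burst has been caused by water hammer pressure."
    print(shortest_dependency_path(sent, "burst", "pressure"))
    print(mark_entities(sent, "burst", "pressure"))
```

The SDP output keeps only the words that syntactically connect the two entities, which is how such models drop noise terms before feeding the remaining context (with marked entities) to BERT.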
DOI | 10.1109/ITAIC54216.2022.9836910 |
Citation Key | sun_research_2022 |