A Novel Approach for Analyzing Entity Linking Between Words and Entities for a Knowledge Base Using an Attention-Based Bilinear Joint Learning and Weighted Summation Model
by: Shuanghu Luo, Penglong Wang, Min Cao
| Format: | Article |
| --- | --- |
| Published: | IEEE 2020-01-01 |
Description
Entity linking (EL) is a natural language processing task that links entity mentions in text to the corresponding entities in a knowledge base. Potential applications include question-answering systems, information extraction, and knowledge base population (KBP). The key to building a high-quality EL system lies in constructing careful representations of words and entities. However, most previous methods assume that all words carry the same weight in their context, which biases the resulting word meanings. In this paper, a novel approach to analyzing entity linking between words and entities for a knowledge base using attention-based bilinear joint learning is proposed. First, the approach designs a novel encoding method to model entities and words in EL; it learns words and entities jointly and uses an attention mechanism to assign different importance values to words in the context. Second, the approach introduces a weighted summation method to form the textual context and applies the same idea to model coherence, improving the ranking features. Finally, the approach employs a pairwise boosting regression tree (PBRT) to rank the candidate entities, taking as input both the features constructed with the weighted summation model and conventional EL features. Experiments demonstrate that, compared with other state-of-the-art methods, the proposed model learns embeddings efficiently and improves EL performance, achieving competitive results on the CoNLL and TAC 2010 datasets.
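The abstract does not give implementation details, but the following minimal Python sketch illustrates the general idea it describes: attention weights computed via a bilinear interaction between context-word embeddings and a candidate-entity embedding, followed by a weighted summation that forms the textual-context feature. The function name, embedding dimensions, bilinear matrix, and softmax normalization are illustrative assumptions, not the authors' actual model.

```python
import numpy as np

def attention_context(word_vecs, entity_vec, A):
    """Attention-based weighted summation of context word vectors.

    Each word gets an importance score from a bilinear form with the
    candidate entity embedding, so words are not assumed to carry
    equal weight in the context (the bias the paper targets).
    """
    # Bilinear compatibility score for every context word: w_i^T A e
    scores = word_vecs @ A @ entity_vec            # shape (num_words,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                       # softmax attention weights
    # Weighted summation forms the textual-context representation
    return weights @ word_vecs                     # shape (dim,)

# Toy example with hypothetical 4-dimensional embeddings
rng = np.random.default_rng(0)
dim = 4
words = rng.normal(size=(6, dim))    # six context-word embeddings
entity = rng.normal(size=dim)        # one candidate-entity embedding
A = rng.normal(size=(dim, dim))      # learned bilinear interaction matrix

context = attention_context(words, entity, A)
# One possible ranking feature: similarity between context and candidate
print(float(context @ entity))
```

In the paper's pipeline, a feature of this kind would be passed, together with conventional EL features, to the PBRT ranker; the similarity score shown here is only a stand-in for whatever ranking features the authors actually construct.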