Document-level Relation Extraction with Relation Correlations
2024 (English) In: Neural Networks, ISSN 0893-6080, E-ISSN 1879-2782, Vol. 171, p. 14-24. Article in journal (Refereed) Published
Abstract [en]
Document-level relation extraction faces two often overlooked challenges: the long-tail problem and the multi-label problem. Previous work focuses mainly on obtaining better contextual representations for entity pairs and hardly addresses these challenges. In this paper, we analyze the co-occurrence correlation of relations and introduce it into the document-level relation extraction task for the first time. We argue that these correlations can not only transfer knowledge between data-rich and data-scarce relations to assist in the training of long-tailed relations, but can also reflect semantic distance, guiding the classifier to identify semantically close relations for multi-label entity pairs. Specifically, we use relation embeddings as a medium and propose two co-occurrence prediction sub-tasks, from coarse- and fine-grained perspectives, to capture relation correlations. Finally, the learned correlation-aware embeddings are used to guide the extraction of relational facts. Extensive experiments on two popular datasets (i.e., DocRED and DWIE) are conducted, and our method achieves superior results compared to the baselines. Insightful analysis also demonstrates the potential of relation correlations to address the above challenges. The data and code are released at https://github.com/RidongHan/DocRE-Co-Occur. © 2023 Elsevier Ltd
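The co-occurrence statistics the abstract builds on can be illustrated with a minimal sketch. This is not the authors' implementation; the relation names and toy annotations below are hypothetical, and it simply estimates, from multi-label entity pairs, how often one relation co-occurs with another:

```python
from collections import Counter
from itertools import combinations

# Hypothetical multi-label annotations: each entity pair maps to the set of
# relations it expresses (relation names are illustrative, not from DocRED).
entity_pair_labels = [
    {"founded", "ceo_of"},
    {"founded", "ceo_of", "owner_of"},
    {"capital_of", "located_in"},
    {"located_in"},
]

def cooccurrence_stats(label_sets):
    """Count relation frequencies and pairwise co-occurrences, then
    estimate the conditional co-occurrence probability P(r2 | r1)."""
    single = Counter()   # how many entity pairs carry relation r
    joint = Counter()    # how many entity pairs carry both r1 and r2
    for labels in label_sets:
        for r in labels:
            single[r] += 1
        for r1, r2 in combinations(sorted(labels), 2):
            joint[(r1, r2)] += 1
            joint[(r2, r1)] += 1
    cond = {(r1, r2): joint[(r1, r2)] / single[r1] for (r1, r2) in joint}
    return single, joint, cond

single, joint, cond = cooccurrence_stats(entity_pair_labels)
print(cond[("founded", "ceo_of")])    # -> 1.0: "founded" always co-occurs with "ceo_of" here
print(cond[("located_in", "capital_of")])  # -> 0.5
```

Statistics like these give exactly the kind of signal the paper's co-occurrence prediction sub-tasks exploit: frequent relations that reliably co-occur with rare ones can transfer knowledge to them, and high conditional probabilities flag semantically close relation pairs.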
Place, publisher, year, edition, pages
Oxford: Elsevier, 2024. Vol. 171, p. 14-24
Keywords [en]
Co-occurrence, Document-level, Multi-task, Relation Correlations, Relation Extraction
National Category
Language Technology (Computational Linguistics)
Identifiers
URN: urn:nbn:se:hh:diva-52317
DOI: 10.1016/j.neunet.2023.11.062
Scopus ID: 2-s2.0-85179586274
OAI: oai:DiVA.org:hh-52317
DiVA, id: diva2:1822493
Note
Funding: This work is supported by the National Natural Science Foundation of China under grants No. 61872163 and 61806084, the Jilin Province Key Scientific and Technological Research and Development Project under grant No. 20210201131GX, and the Jilin Provincial Education Department Project under grant No. JJKH20190160KJ.
Available from: 2023-12-22 Created: 2023-12-22 Last updated: 2024-01-17 Bibliographically approved