relation extraction
Also known as: relationship extraction, RE
Facts (75)
Sources
Combining large language models with enterprise knowledge graphs frontiersin.org Aug 26, 2024 13 facts
procedure: Early distant-supervision approaches to relation extraction use supervised methods to align positive and negative relation pairs for pre-training language models, followed by few-shot learning to extract relations.
claim: Inaccurate Named Entity Recognition and Relation Extraction prompting results can be corrected through active learning techniques (Wu et al., 2022) or by distilling large Pre-trained Language Models into smaller models for specific tasks (Agrawal et al., 2022).
claim: Relation extraction (RE) identifies and categorizes relationships between entities in unstructured text to expand knowledge graph structures, while named entity recognition (NER) focuses on recognizing, classifying, and linking entities in text to a knowledge base.
claim: Supervised methods for named entity recognition and relation extraction typically involve a pretraining stage followed by zero-shot learning or the use of specialized architectures and training setups.
claim: The authors of 'Combining large language models with enterprise knowledge graphs' identify LLMs, knowledge graph, relation extraction, knowledge graph enrichment, AI, enterprise AI, carbon footprint, and human in the loop as the primary keywords for their research.
claim: Modeling Named Entity Recognition (NER) or Relation Extraction (RE) as classification problems forces models to predict a specific entity or relation, which leaves little room for uncertainty.
procedure: The process for enriching the life sciences-oriented Sensigrafo knowledge graph involves the following steps: (1) marking entities in PubMed documents using the Cogito disambiguator, (2) generating possible relations using a distant supervision module grounded on Sensigrafo, (3) transforming documents into contextualized embeddings using a field-specific pre-trained language model like BioBERT, (4) performing adapter-based fine-tuning for relation extraction using contrastive learning, and (5) ranking predictions by model confidence.
reference: The paper 'Rescue implicit and long-tail cases: nearest neighbor relation extraction' by Wan et al. (2022) proposes a nearest neighbor approach to relation extraction specifically aimed at addressing implicit and long-tail cases.
procedure: Multi-instance learning (MIL), as proposed by Zeng et al. (2015), is a method to address distant supervision noise in Relation Extraction (RE) that groups sentences into bags labeled as positive or negative with respect to a relation, shifting the task from single sentences to bags.
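The bag construction at the heart of multi-instance learning can be sketched in a few lines (a minimal illustration, assuming distantly labeled sentences arrive as `(head, relation, tail, sentence)` tuples; all names here are hypothetical):

```python
from collections import defaultdict

def build_bags(labeled_sentences):
    """Group distantly supervised sentences into bags keyed by the
    (head, relation, tail) triple; the relation label attaches to the
    bag as a whole, not to any single sentence."""
    bags = defaultdict(list)
    for head, rel, tail, sentence in labeled_sentences:
        bags[(head, rel, tail)].append(sentence)
    return dict(bags)

data = [
    ("Obama", "born_in", "Honolulu", "Obama was born in Honolulu."),
    ("Obama", "born_in", "Honolulu", "Obama visited Honolulu in 2008."),
]
bags = build_bags(data)
```

Because only the bag carries the label, a noisy sentence (the second one above, which co-occurs with the entity pair without stating the relation) no longer has to be individually correct.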
claim: Distant supervision (DS) is an automated data labeling technique that aligns knowledge bases with raw corpora to produce annotated data, used to address the lack of large annotated corpora for relation extraction and named entity recognition.
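The alignment heuristic behind distant supervision can be sketched as follows (a toy illustration: matching entities by substring is an assumption for brevity, and real systems use entity linking instead):

```python
def distant_label(sentence, kb_triples):
    """Label a sentence with every KB relation whose head and tail
    entities both occur in it -- the core distant-supervision heuristic."""
    labels = set()
    for head, rel, tail in kb_triples:
        if head in sentence and tail in sentence:
            labels.add(rel)
    return labels

kb = [("Barack Obama", "born_in", "Honolulu"),
      ("Honolulu", "located_in", "Hawaii")]
sent = "Barack Obama was born in Honolulu, Hawaii."
# Both triples match this sentence; the second match is spurious,
# which is exactly the labeling noise that motivates bag-level methods.
labels = distant_label(sent, kb)
```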
reference: Bassignana and Plank (2022) report that cross-dataset and cross-domain setups for Relation Extraction (RE) are particularly deficient in terms of data quality and availability.
reference: Recent literature identifies two primary approaches to named entity recognition and relation extraction: creating large training sets with hand-curated or extensive automatic annotations to fine-tune large language models, or using precise natural language instructions to replace domain knowledge with prompt engineering.
procedure: Relation Extraction tasks are often rephrased as question answering (Levy et al., 2017), which involves injecting latent knowledge contained in relation labels into prompt construction (Chen et al., 2022) and iteratively fine-tuning prompts to enhance the model's ability to focus on semantic cues (Son et al., 2022).
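The QA reformulation can be sketched as a prompt template keyed by relation label (a toy example; the template wordings and relation names are assumptions, not those of the cited papers):

```python
# Each relation label maps to a question template; the latent knowledge
# in the label ("born_in" -> a "where ... born" question) is what gets
# injected into the prompt.
RELATION_QUESTIONS = {
    "born_in": "Where was {head} born?",
    "founded_by": "Who founded {head}?",
}

def re_as_qa_prompt(head, relation, context):
    """Rephrase a relation-extraction query as extractive QA over a context."""
    question = RELATION_QUESTIONS[relation].format(head=head)
    return f"Context: {context}\nQuestion: {question}\nAnswer:"

prompt = re_as_qa_prompt(
    "Tesla", "founded_by",
    "Tesla was founded by Martin Eberhard and Marc Tarpenning.",
)
```

The tail entity is then whatever span the QA model returns, so no fixed classification head over relation types is needed.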
Construction of Knowledge Graphs: State and Challenges - arXiv arxiv.org 10 facts
procedure: The XI Pipeline uses distant supervision and an aggregated piecewise convolution network trained on existing knowledge graph relations for relation extraction.
reference: M. Mintz, S. Bills, R. Snow, and D. Jurafsky published 'Distant supervision for relation extraction without labeled data' in the Proceedings of the Joint Conference of the 47th Annual Meeting of the ACL and the 4th International Joint Conference on Natural Language Processing of the AFNLP in 2009.
reference: D. Zeng, K. Liu, Y. Chen, and J. Zhao published 'Distant Supervision for Relation Extraction via Piecewise Convolutional Neural Networks' in the proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing (EMNLP 2015) in Lisbon, Portugal, in September 2015.
claim: Relation extraction is the process of determining relationships among identified entities within a text.
procedure: The AutoKnow system extracts relations using classification models for attribute applicability and a regression model for attribute importance, applied to product profiles and user search, review, or Q&A data.
claim: Open Information Extraction (OpenIE) is a method of relation extraction that extracts relations without relying on a pre-defined set of relation types.
claim: Relation extraction can convert text snippets into triples, such as transforming the text 'album Syro' into the triple 'dbr:Syro rdf:type dbo:Album'.
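The triple conversion from the 'album Syro' example can be sketched as a small helper (illustrative only; it hardcodes the DBpedia-style `dbr:`/`dbo:` prefixes used in the example):

```python
def to_triple(entity, entity_class):
    """Render a typed entity mention as a DBpedia-style RDF triple,
    mirroring the 'album Syro' -> dbr:Syro rdf:type dbo:Album example."""
    return (f"dbr:{entity}", "rdf:type", f"dbo:{entity_class}")

triple = to_triple("Syro", "Album")
```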
procedure: Text-based knowledge representation involves three main steps: named-entity recognition, entity linking, and relation extraction.
claim: Knowledge Extraction is the process of deriving structured information and knowledge from unstructured or semi-structured data using techniques such as named entity recognition, entity linking, relation extraction, and the canonicalization of entity and relation identifiers.
reference: T.H. Nguyen and R. Grishman published 'Relation Extraction: Perspective from Convolutional Neural Networks' in the proceedings of the 1st Workshop on Vector Space Modeling for Natural Language Processing (VS@NAACL-HLT 2015) in Denver, Colorado, USA, in June 2015.
The construction and refined extraction techniques of knowledge ... nature.com Feb 10, 2026 8 facts
claim: Recent frameworks for Relation Extraction (RE) incorporate entity masking and contrastive pretraining to enhance robustness, while PEFT-based methods and tools like OpenNRE provide scalable and adaptable solutions.
procedure: The knowledge extraction process described in the study consists of three main steps: text refinement, entity extraction, and relationship extraction, which are designed to extract structured, high-quality knowledge from unstructured text.
measurement: The fine-tuned model developed in the study achieves substantial gains in relationship extraction accuracy, while the resulting knowledge graph demonstrates strong performance in semantic coherence and operational reasoning assessments.
procedure: Relationship extraction is a process for identifying and constructing logical relationships between entities from unstructured text, consisting of two stages: relationship localization and hierarchical matching validation.
claim: Sentence-level attention mechanisms reduce noise in Relation Extraction (RE) tasks by weighing relevant context.
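Sentence-level attention over a bag can be sketched with plain softmax weighting (a minimal sketch, assuming per-sentence relevance scores and sentence vectors are already computed; real systems learn the scores against a relation query vector):

```python
import math

def bag_attention(scores, vectors):
    """Combine sentence vectors in a bag using softmax weights over
    per-sentence relevance scores, so noisy sentences (low score)
    contribute less to the bag representation."""
    exps = [math.exp(s) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    dim = len(vectors[0])
    return [sum(w * v[i] for w, v in zip(weights, vectors)) for i in range(dim)]

# Equal scores degrade to a plain average of the two sentence vectors.
rep = bag_attention([0.0, 0.0], [[1.0, 0.0], [3.0, 0.0]])
```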
procedure: The hierarchical matching validation process for relationship extraction operates in two layers: the first layer validates entity alignment against a standardized database and branch/unit composition rules; the second layer performs logical validation by filtering infeasible relationships based on equipment operational range, task time windows, and battlefield physical laws.
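The two-layer validation can be sketched as a filter (an illustrative sketch: the entity set, relation names, and feasibility predicates below are invented stand-ins for the study's database and rules):

```python
def validate_relation(triple, entity_db, feasibility_rules):
    """Two-layer validation sketch: layer 1 checks that both entities
    align with a standardized database; layer 2 filters relations that
    violate any logical feasibility rule (operational range, time
    windows, etc., represented here as boolean predicates)."""
    head, _, tail = triple
    if head not in entity_db or tail not in entity_db:
        return False                                     # layer 1
    return all(rule(triple) for rule in feasibility_rules)  # layer 2

db = {"unit_A", "bridge_7"}
known_action = lambda t: t[1] in {"crosses", "defends"}  # toy rule
ok = validate_relation(("unit_A", "crosses", "bridge_7"), db, [known_action])
bad = validate_relation(("unit_A", "crosses", "bridge_9"), db, [known_action])
```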
claim: Convolutional Neural Networks (CNNs) improve Relation Extraction (RE) classification by extracting local features, while distant supervision enables automatic labeling but introduces noise.
claim: The final output of the relationship extraction process is integrated into a domain knowledge system framework to ensure the resulting relationship network supports task simulation.
Unlocking the Potential of Generative AI through Neuro-Symbolic ... arxiv.org Feb 16, 2025 7 facts
reference: Rinaldo Lima, Bernard Espinasse, and Frederico Freitas authored the paper 'The impact of semantic linguistic features in relation extraction: A logical relational learning approach', presented at the International Conference on Recent Advances in Natural Language Processing (RANLP 2019) in 2019.
reference: Xiaoyan Zhao, Yang Deng, Min Yang, Lingzhi Wang, Rui Zhang, Hong Cheng, Wai Lam, Ying Shen, and Ruifeng Xu published 'A comprehensive survey on relation extraction: Recent advances and new frontiers' in ACM Computing Surveys in 2024.
claim: Methods such as Graph Neural Networks (GNNs), Named Entity Recognition (NER), link prediction, and relation extraction fall into the Neuro[Symbolic] category because they leverage symbolic relationships like ontologies or graphs to enhance neural processing.
reference: Mengjia Zhou, Donghong Ji, and Fei Li authored the paper 'Relation extraction in dialogues: A deep learning model based on the generality and specialty of dialogue text', published in IEEE/ACM Transactions on Audio, Speech, and Language Processing, 29:2015-2026, in 2021.
reference: Natural language processing (NLP) technologies include retrieval-augmented generation (RAG), sequence-to-sequence models, semantic parsing, named entity recognition (NER), and relation extraction.
reference: Tao Wu, Xiaolin You, Xingping Xian, Xiao Pu, Shaojie Qiao, and Chao Wang published 'Towards deep understanding of graph convolutional networks for relation extraction' in Data & Knowledge Engineering in 2024.
claim: Graph Neural Networks (GNNs) are used for relation extraction, where they identify and classify semantic relationships between entities to build and enhance knowledge graphs.
A survey on augmenting knowledge graphs (KGs) with large ... link.springer.com Nov 4, 2024 5 facts
procedure: The process of integrating KGs with LLMs begins with data preparation, which involves extracting entities and relationships from KGs using techniques like Named Entity Recognition (NER) and relation extraction.
procedure: The LLM-augmented KG process is structured into two principal stages: (1) synthesizing KGs by applying LLMs to perform coreference resolution, named entity recognition, and relationship extraction to relate entities from input documents; (2) performing tasks on the constructed KG using LLMs, including KG completion to fill gaps, KG question answering to answer queries, and KG text generation to produce descriptions of nodes.
claim: Named entity recognition, coreference resolution, and relation extraction are techniques commonly applied to create detailed and accurate knowledge graphs.
reference: In LLM-augmented Knowledge Graphs, LLMs are used to improve KG representations, encode text or generate facts for KG completion, perform entity discovery and relation extraction for KG construction, describe KG facts in natural language, and connect natural language questions to KG-based answers, as cited in [55, 56, 57].
procedure: Semantic parsing, entity linking, and relation extraction are techniques used to implement semantic layers by extracting and inferring critical concepts and relationships from data to feed into LLMs during processing.
LLM-Powered Knowledge Graphs for Enterprise Intelligence and ... arxiv.org Mar 11, 2025 4 facts
measurement: The knowledge-graph-enhanced LLM system achieved 92% accuracy in entity extraction and 89% accuracy in relationship extraction, with contextual enrichment improving task alignment by 15%.
claim: The relation extraction component utilizes Large Language Models (LLMs) with advanced prompt engineering, incorporating both contextual data from the Contextual Retrieval Module (CRM) and extracted entities as input to enhance the precision and relevance of relationship extraction.
procedure: The Entity-Relationship Extraction layer processes output from the Smart-Summarizer to perform Named Entity Extraction (names, locations, dates) and Relation Extraction (e.g., traveling_on, staying_at, attending_event, participating_in).
procedure: The Entity-Relationship Extraction workflow begins with contextual retrieval, followed by entity extraction, and concludes with relationship extraction to ensure an accurate mapping of interactions.
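The three-stage workflow can be sketched as a pipeline of injected callables (a structural sketch only: the stage functions below are hypothetical stand-ins for the paper's retrieval, NER, and RE modules):

```python
def er_pipeline(document, retrieve_context, extract_entities, extract_relations):
    """Run contextual retrieval first, then entity extraction, then
    relation extraction, each stage conditioned on the outputs of the
    stages before it."""
    context = retrieve_context(document)
    entities = extract_entities(document, context)
    relations = extract_relations(document, context, entities)
    return {"context": context, "entities": entities, "relations": relations}

out = er_pipeline(
    "Ada is attending KGC in New York.",
    lambda doc: "travel itinerary",                          # toy retrieval
    lambda doc, ctx: ["Ada", "KGC", "New York"],             # toy NER
    lambda doc, ctx, ents: [("Ada", "attending_event", "KGC")],  # toy RE
)
```

Passing the retrieved context into both later stages is what lets the relation step use contextual cues rather than the raw document alone.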
Practices, opportunities and challenges in the fusion of knowledge ... frontiersin.org 4 facts
reference: DREEAM (Ma et al., 2023) introduces a memory-efficient approach to relation extraction that uses evidence information as a supervisory signal to guide the attention module in assigning high weights to evidence.
reference: BertNet (Hao et al., 2022) improves relation extraction efficiency and accuracy by using a search and re-scoring mechanism that searches a wide entity pair space with minimal relationship definitions.
claim: Injecting real-time data into Knowledge Graph and Large Language Model fusion systems increases inference time due to the requirement for complex preprocessing, relationship extraction, and context modeling operations.
claim: Large Language Models (LLMs) assist in Knowledge Graph construction by acting as prompts and generators for entity, relation, and event extraction, as well as performing entity linking and coreference resolution.
KG-RAG: Bridging the Gap Between Knowledge and Creativity - arXiv arxiv.org May 20, 2024 2 facts
claim: Jointly performing Named Entity Recognition and Relationship Extraction reduces error propagation and improves overall performance in Knowledge Graph construction.
claim: Named Entity Recognition and Relationship Extraction are key tasks for constructing Knowledge Graphs from unstructured text.
Combining Knowledge Graphs and Large Language Models - arXiv arxiv.org Jul 9, 2024 3 facts
claim: KGs constructed by text mining utilize subtasks like named entity recognition and relationship extraction to extract graph data from text, but these KGs are limited by the quality and scope of the input data.
claim: KnowBERT exhibits better performance on relation extraction, words in context, and entity typing tasks compared to standard BERT.
claim: Utilizing LLMs for tasks like relation extraction and property identification in the KG construction process can make the construction more automatic while maintaining accuracy.
A Survey of Incorporating Psychological Theories in LLMs - arXiv arxiv.org 2 facts
claim: Maharaj et al. (2023) and Yu et al. (2022) leverage selective attention mechanisms in LLMs to detect hallucinations and extract relations.
reference: Xin Miao, Yongqi Li, Shen Zhou, and Tieyun Qian proposed a neuromorphic mechanism for episodic memory retrieval in large language models to generate commonsense counterfactuals for relation extraction, as detailed in their 2024 paper in the Findings of the Association for Computational Linguistics: ACL 2024.
Efficient Knowledge Graph Construction and Retrieval from ... - arXiv arxiv.org Aug 7, 2025 2 facts
reference: Dhanachandra Ningthoujam et al. published 'Relation extraction between the clinical entities based on the shortest dependency path based LSTM' as an arXiv preprint in 2019.
claim: Building a knowledge graph at enterprise scale incurs significant GPU or CPU costs and high latency when relying on Large Language Models or heavyweight NLP pipelines for entity and relation extraction.
Addressing common challenges with knowledge graphs - SciBite scibite.com 2 facts
claim: Semantic technologies facilitate the construction of knowledge graphs by enabling data alignment with standards, data harmonization, relation extraction, and schema generation from both unstructured literature and structured data sources.
procedure: Building a knowledge graph requires four specific steps: aligning data with standards, harmonizing datasets, extracting relations from the data, and generating the schema.
Knowledge Graphs: Opportunities and Challenges - Springer Nature link.springer.com Apr 3, 2023 2 facts
reference: Wang et al. (2018a) proposed a knowledge graph-based information retrieval technology that constructs knowledge graphs by extracting entities from web pages using an open-source relation extraction method and linking those entities with their relationships.
reference: The three primary methods of knowledge acquisition are relation extraction, entity extraction, and attribute extraction, with attribute extraction functioning as a subset of entity extraction (Fu et al. 2019).
LLM-empowered knowledge graph construction: A survey - arXiv arxiv.org Oct 23, 2025 2 facts
claim: Supervised, weakly supervised, and unsupervised relation extraction paradigms are dependent on annotated data and suffer from limited cross-domain generalization, as categorized by Detroja et al. (2023).
reference: The LLMs4OL framework, developed by Giglou et al. (2023), verified the capacity of Large Language Models for concept identification, relation extraction, and semantic pattern induction in general-purpose domains.
How NebulaGraph Fusion GraphRAG Bridges the Gap Between ... nebula-graph.io Jan 27, 2026 1 fact
claim: Building a knowledge graph traditionally requires NLP expertise in named entity recognition, relationship extraction, and entity linking, alongside significant volumes of labeled data and model fine-tuning.
Combining Knowledge Graphs With LLMs | Complete Guide - Atlan atlan.com Jan 28, 2026 1 fact
claim: Relationship extraction accuracy in knowledge graphs varies by document type, which necessitates domain-specific tuning.
Enterprise AI Requires the Fusion of LLM and Knowledge Graph stardog.com Dec 4, 2024 1 fact
reference: The Stardog Platform includes infrastructure support for RAG that uses an interactive process of named entity, event, and relationship extraction to automatically complete Knowledge Graphs with document-resident knowledge.
The State of the Art on Knowledge Graph Construction from Text zenodo.org May 5, 2022 1 fact
reference: The presentation titled 'The State of the Art on Knowledge Graph Construction from Text: Named Entity Recognition and Relation Extraction Perspectives' covers benchmark dataset resources and neural models for knowledge graph construction tasks.
A Knowledge-Graph Based LLM Hallucination Evaluation Framework themoonlight.io 1 fact
procedure: The GraphEval framework constructs a Knowledge Graph from LLM output through a four-step pipeline: (1) processing input text, (2) detecting unique entities, (3) performing coreference resolution to retain only specific references, and (4) extracting relations to form triples of (entity1, relation, entity2).
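The four-step pipeline can be sketched with toy heuristics (illustrative stand-ins only: capitalized-token NER and a skipped coreference step are simplifications, not GraphEval's actual models):

```python
def build_kg(text):
    """Toy four-step sketch of a GraphEval-style pipeline."""
    # Step 1: process the input text into sentences.
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    triples = []
    for sent in sentences:
        words = sent.split()
        # Step 2: detect capitalized tokens as entities (toy NER).
        entities = [w for w in words if w[0].isupper()]
        # Step 3: coreference resolution is skipped in this toy version
        # (we assume all references are already specific).
        # Step 4: extract (entity1, relation, entity2) when exactly two
        # entities flank a middle span, which becomes the relation text.
        if len(entities) == 2:
            i, j = words.index(entities[0]), words.index(entities[1])
            relation = " ".join(words[i + 1 : j])
            if relation:
                triples.append((entities[0], relation, entities[1]))
    return triples

triples = build_kg("Paris is the capital of France.")
```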
The State Of The Art On Knowledge Graph Construction From Text nlpsummit.org 1 fact
claim: Nandana Mihindukulasooriya's research interests include relation extraction and linking, information extraction, knowledge representation and reasoning, and Neuro-Symbolic AI.
Bridging the Gap Between LLMs and Evolving Medical Knowledge arxiv.org Jun 29, 2025 1 fact
reference: BioBERT (Lee et al., 2020), PubMedBERT (Gu et al., 2021), and MedPaLM (Singhal et al., 2023) are domain-specific language models that adapt transformer pre-training to biomedical corpora to improve entity recognition, relation extraction, and multiple-choice QA.