concept

natural language understanding

Also known as: NLU

Facts (29)

Sources
Combining large language models with enterprise knowledge graphs (frontiersin.org, Frontiers, Aug 26, 2024) · 7 facts
perspective: Creating sustainable and effective Natural Language Understanding (NLU) solutions that meet the dynamic requirements of modern enterprises requires balancing computational costs, model flexibility, and training methods, as argued by Faiz et al. (2023).
claim: The proposed knowledge graph expansion pipeline addresses computational efficiency, data quality, evolving knowledge, and adaptive representations in natural language understanding simultaneously.
claim: Expert.AI, an enterprise specializing in Natural Language Understanding solutions, relies on Knowledge Graphs that are meticulously created and curated by expert linguists.
claim: Expert.AI utilizes a collection of large Knowledge Graphs (KGs) called Sensigrafos, which are built by linguists and domain experts and modified to improve performance in downstream Natural Language Understanding (NLU) tasks.
claim: Developing enterprise-level Natural Language Understanding (NLU) solutions requires consideration of computational resources and carbon footprint due to the high environmental and economic costs associated with traditional model training, as noted by Patil and Gudivada (2024).
procedure: Human-in-the-loop (HITL) methods in Natural Language Understanding (NLU) involve: (1) starting with a small set of annotated data, (2) selecting samples that are challenging for the model, (3) having humans annotate these samples, (4) updating the model with the new annotations, and (5) repeating the process.
claim: Sensigrafo is an enterprise Knowledge Graph developed by Expert.AI that focuses on Natural Language Understanding through a machine-oriented lexicon representation.
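The HITL procedure above can be sketched as a minimal active-learning loop. Everything here is a toy stand-in: `ToyNLUModel`, its `fit`/`confidence` interface, and the `annotate` callback are hypothetical, not any particular library's API.

```python
class ToyNLUModel:
    """Stand-in for a real NLU classifier (hypothetical interface)."""
    def __init__(self):
        self.seen = set()

    def fit(self, labeled):
        # "Training" here just memorises which inputs have labels.
        self.seen = {x for x, _ in labeled}

    def confidence(self, x):
        # The toy model is only confident about inputs it has trained on.
        return 1.0 if x in self.seen else 0.0


def hitl_loop(model, pool, annotate, rounds=5, batch=10):
    # (1) start from a small annotated seed set
    labeled = [(x, annotate(x)) for x in pool[:batch]]
    pool = pool[batch:]
    model.fit(labeled)
    for _ in range(rounds):
        if not pool:
            break
        # (2) pick the samples the current model is least confident about
        pool.sort(key=model.confidence)
        hard, pool = pool[:batch], pool[batch:]
        # (3) a human annotates them; (4) the model is updated
        labeled += [(x, annotate(x)) for x in hard]
        model.fit(labeled)
    # (5) the select/annotate/update cycle repeats each round
    return model, labeled
```

In practice the selection step uses model uncertainty (e.g. prediction entropy) rather than a memorisation heuristic, but the loop structure is the same.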
A survey on augmenting knowledge graphs (KGs) with large ... (link.springer.com, Springer, Nov 4, 2024) · 5 facts
claim: Benchmarks like SimpleQuestions and FreebaseQA provide standardized datasets and evaluation metrics for consistent and comparative assessment of LLMs integrated with knowledge graphs, covering tasks such as natural language understanding, question answering, commonsense reasoning, and knowledge graph completion.
reference: Wang, A., et al. authored 'GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding', published as an arXiv preprint in 2018 (arXiv:1804.07461).
claim: Integrating LLMs with KGs improves natural language understanding and generation by allowing models to access structured data within KGs to provide accurate responses that require deep knowledge, such as specific scientific or technical details for historical events.
reference: The GLUE (General Language Understanding Evaluation) benchmark assesses model performance on natural language understanding tasks, including sentence similarity, textual entailment, and sentiment analysis.
reference: SuperGLUE is an extension of the GLUE benchmark designed to evaluate natural language understanding through more challenging tasks such as causal reasoning and coreference resolution.
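The point of GLUE-style benchmarks is that every model is scored on the same labeled examples with the same metric. A minimal sketch of such a harness (the two-example tasks and the word-overlap "model" are hypothetical toys, not real GLUE data):

```python
# Each task maps to labeled examples: ((sentence_a, sentence_b), gold_label).
TASKS = {
    "entailment": [
        (("A dog runs in the park.", "An animal is outside."), 1),
        (("A dog runs in the park.", "A cat sleeps indoors."), 0),
    ],
    "similarity": [
        (("The car is red.", "The automobile is red."), 1),
        (("The car is red.", "It rained yesterday."), 0),
    ],
}


def word_overlap_model(pair):
    """Toy predictor: label 1 when the sentences share enough words."""
    a, b = (set(s.lower().strip(".").split()) for s in pair)
    return 1 if len(a & b) >= 2 else 0


def accuracy(predict, examples):
    return sum(predict(x) == y for x, y in examples) / len(examples)


def evaluate(predict, tasks):
    """Score one model on every task with one shared metric, so results
    are directly comparable across systems."""
    return {name: accuracy(predict, examples) for name, examples in tasks.items()}
```

The real GLUE and SuperGLUE suites ship thousands of examples per task and task-specific metrics (e.g. Matthews correlation for CoLA), but the comparative-evaluation structure is the same.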
Practices, opportunities and challenges in the fusion of knowledge ... (frontiersin.org, Frontiers) · 4 facts
reference: Wang et al. (2022) explored knowledge prompting in pre-trained language models for natural language understanding.
claim: KG-Retrieval NLU is a parameter-efficient framework that leverages multimodal knowledge graph retrieval from VisualSem to enhance natural language understanding.
reference: The paper 'DKPLM: decomposable knowledge-enhanced pre-trained language model for natural language understanding' was published in the Proceedings of the AAAI Conference on Artificial Intelligence in 2022.
reference: KP-PLM, introduced by Wang J. et al. in 2022, advances knowledge prompting by using dynamic subgraph conversion and dual self-supervised tasks, which improves performance in both full and low-resource natural language understanding (NLU) tasks.
The Synergy of Symbolic and Connectionist AI in LLM ... (arxiv.org, arXiv) · 2 facts
claim: OpenAI's GPT-4 is an example of a Large Language Model that demonstrates unprecedented capabilities in natural language understanding and generation, exhibiting robust performance across a range of complex tasks.
claim: LAAs adapt flexibly to diverse scenarios by integrating pre-trained language models with natural language understanding.
A Comprehensive Review of Neuro-symbolic AI for Robustness ... (link.springer.com, Springer, Dec 9, 2025) · 2 facts
claim: Neuro-symbolic AI enables natural language understanding tasks such as fact verification, legal analysis, and knowledge base completion through hybrid reasoning over dynamic knowledge graphs.
claim: Neural network models utilize large-scale data to learn distributed representations, such as feature vectors, through backpropagation, which enables performance in perception tasks like image recognition, natural language understanding, and speech processing.
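A rough illustration of the hybrid fact-verification style described above: a symbolic component proves triples against a knowledge graph (directly or via a composition rule), and a stubbed "neural" score handles claims the rules cannot decide. All triples, relation names, and the threshold are hypothetical.

```python
# Hypothetical KG as a set of (subject, relation, object) triples.
TRIPLES = {
    ("Paris", "capital_of", "France"),
    ("France", "member_of", "EU"),
}


def symbolic_verify(triple, kg):
    """Prove a triple directly, or via one hand-written composition rule."""
    if triple in kg:
        return True
    s, r, o = triple
    if r == "located_in_region":
        # capital_of(s, y) and member_of(y, o) imply located_in_region(s, o)
        middles = {y for (a, rel, y) in kg if a == s and rel == "capital_of"}
        return any((y, "member_of", o) in kg for y in middles)
    return False


def neural_score(triple):
    """Stand-in for a learned plausibility model: a fixed uncertain prior."""
    return 0.5


def hybrid_verify(triple, kg, threshold=0.9):
    """Trust a symbolic proof when one exists; otherwise fall back to the
    (uncertain) neural score."""
    if symbolic_verify(triple, kg):
        return True
    return neural_score(triple) >= threshold
```

Real neuro-symbolic systems replace both stubs with a trained scorer and a learned or curated rule base, but the division of labour (provable facts via symbols, fuzzy cases via the network) is the pattern the claim describes.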
Overcoming the limitations of Knowledge Graphs for Decision ... (xpertrule.com, XpertRule) · 1 fact
claim: Knowledge graphs reduce AI hallucinations and improve natural language understanding by providing necessary context to AI models.
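The grounding mechanism behind this claim can be sketched in a few lines: retrieve KG triples for entities mentioned in a question and prepend them to the prompt, so the model answers from structured facts rather than guessing. The triple store, entity names, and naive substring-based entity linking below are all hypothetical.

```python
# Hypothetical triple store: entity -> list of (relation, object) facts.
KG = {
    "Marie Curie": [("field", "radioactivity"),
                    ("award", "Nobel Prize in Physics")],
    "Nobel Prize in Physics": [("first awarded", "1901")],
}


def link_entities(question, kg):
    """Naive entity linking: KG entity names that appear in the question."""
    q = question.lower()
    return [e for e in kg if e.lower() in q]


def ground_prompt(question, kg):
    """Build a prompt whose context section is retrieved KG triples."""
    facts = [f"({e}, {r}, {o})"
             for e in link_entities(question, kg) for r, o in kg[e]]
    context = "\n".join(facts) if facts else "(no matching facts)"
    return (f"Facts:\n{context}\n\n"
            f"Question: {question}\n"
            "Answer using only the facts above.")
```

Production systems use proper entity linking and graph queries (e.g. SPARQL) for the retrieval step, but the prompt-grounding shape is the same.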
Reducing hallucinations in large language models with custom ... (aws.amazon.com, Amazon Web Services, Nov 26, 2024) · 1 fact
account: Shayan Ray is an Applied Scientist at Amazon Web Services whose research focuses on natural language processing, natural language understanding, natural language generation, conversational AI, task-oriented dialogue systems, and LLM-based agents.
The Synergy of Symbolic and Connectionist AI in LLM-Empowered ... (arxiv.org, arXiv, Jul 11, 2024) · 1 fact
claim: LAAs adapt flexibly to diverse scenarios and expand AI's potential in autonomous operations by integrating pre-trained language models with natural language understanding.
Neuro-Symbolic AI: Explainability, Challenges, and Future Trends (arxiv.org, arXiv, Nov 7, 2024) · 1 fact
reference: Liu et al. (2022) proposed a neuro-symbolic approach for natural language understanding.
Combining Knowledge Graphs and Large Language Models (arxiv.org, arXiv, Jul 9, 2024) · 1 fact
claim: In 2023, Hu et al. surveyed knowledge-enhanced pre-trained models with a focus on two key tasks in Natural Language Processing: Natural Language Understanding and Natural Language Generation.
A Survey on the Theory and Mechanism of Large Language Models (arxiv.org, arXiv, Mar 12, 2026) · 1 fact
reference: The paper 'Shortcut learning of large language models in natural language understanding' was published in Communications of the ACM 67 (1), pp. 110–120.
The Integration of Symbolic and Connectionist AI in LLM-Driven ... (econpapers.repec.org, Ankit Sharma, Journal of Artificial Intelligence General Science) · 1 fact
claim: Large Language Models (LLMs) exhibit traits of both symbolic and connectionist paradigms and can serve as the backbone for integrating these approaches to improve decision-making, natural language understanding, and autonomy in intelligent agents.
A Survey of Incorporating Psychological Theories in LLMs (arxiv.org, arXiv) · 1 fact
reference: Emily M. Bender and Alexander Koller published 'Climbing towards NLU: On meaning, form, and understanding in the age of data' in 2020, discussing the limitations of language models regarding meaning and understanding.
Call for Papers: Special Session on KR and Machine Learning (kr.org, KR) · 1 fact
claim: The Special Session on KR and Machine Learning at KR2022 welcomes papers on topics including learning symbolic knowledge (ontologies, knowledge graphs, action theories, commonsense knowledge, spatial/temporal theories, preference/causal models); logic-based and relational learning algorithms; machine-learning-driven reasoning; neural-symbolic learning; statistical relational learning; multi-agent learning; symbolic reinforcement learning; learning symbolic abstractions from unstructured data; explainable AI; the expressive power of learning representations; knowledge-driven natural language understanding and dialogue; knowledge-driven decision making; knowledge-driven intelligent systems for IoT and cybersecurity; and architectures combining data-driven techniques with formal reasoning.