Claim
Using off-the-shelf large language models in legal contexts without further training or validation poses significant risks due to high hallucination rates.
Authors
Sources
- EdinburghNLP/awesome-hallucination-detection (GitHub, via Serper)
Referenced by nodes (1)
- Large Language Models concept