claim
Large language models can generate text that appears authoritative in register, structure, and terminology while containing factually incorrect claims regarding dosages, legal precedents, regulatory requirements, or technical specifications.
Authors
Sources
- Hallucination Causes: Why Language Models Fabricate Facts (mbrenndoerfer.com)
Referenced by nodes (1)
- Large Language Models concept