Claim
Large language model hallucinations can create legal liability for the organization responsible for the system if the model generates defamatory or discriminatory content.
Authors
Sources
- LLM Hallucinations: Causes, Consequences, Prevention (llmmodels.org, via Serper)