Formula
In the probabilistic generative framework for Large Language Models, an LLM is modeled as a probabilistic generator P_θ(y | x) parameterized by θ, where x is the input prompt and y is the generated output. Hallucinations emerge when the model assigns higher probability to an incorrect or ungrounded output sequence than to a factually grounded alternative, i.e., when P_θ(y_ungrounded | x) > P_θ(y_grounded | x).
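A minimal sketch of this framing: score two candidate continuations under a toy model and check which one the model prefers. The conditional token probabilities below are hypothetical, invented only to illustrate the condition; a real LLM would supply P_θ(token | context) instead.

```python
import math

# Hypothetical toy conditional probabilities P_theta(token | two-token context).
# Values are made up; they only illustrate a model that ranks an ungrounded
# continuation ("Lyon") above the grounded one ("Paris").
cond_probs = {
    ("France", "is"): {"Paris": 0.4, "Lyon": 0.5},
}

def sequence_log_prob(prompt, y_tokens):
    """Compute log P_theta(y | x) by chaining per-token conditionals."""
    context = list(prompt)
    logp = 0.0
    for tok in y_tokens:
        key = (context[-2], context[-1])
        p = cond_probs.get(key, {}).get(tok, 1e-9)  # tiny floor for unseen tokens
        logp += math.log(p)
        context.append(tok)
    return logp

prompt = ["the", "capital", "of", "France", "is"]
lp_grounded = sequence_log_prob(prompt, ["Paris"])
lp_ungrounded = sequence_log_prob(prompt, ["Lyon"])

# In this framing, a hallucination is exactly the case where the
# ungrounded sequence outscores the grounded one.
print(lp_ungrounded > lp_grounded)
```

Under these toy numbers the comparison prints `True`: the model's probability ranking, not any external fact, decides what it generates, which is why hallucination is defined here as a property of P_θ rather than of the decoding procedure.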
