Procedure
The annotation process for LLM outputs tasks volunteer doctors with classifying sub-sections of each output as hallucinations or omissions according to a specific taxonomy, and with providing free-text explanations for their classifications.
Authors
Sources
- A framework to assess clinical safety and hallucination rates of LLMs ... www.nature.com via serper
Referenced by nodes (2)
- hallucination concept
- omissions concept