Claim
LLaMA 2 (13B) benefits significantly from Chain-of-Thought (CoT) prompting, though ambiguous or underspecified instructions can lead it to hallucinate details.
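As a minimal sketch of what the claim describes: a CoT prompt asks the model to reason step by step before answering, in contrast to a direct-answer prompt. The `build_prompt` helper and the example question below are hypothetical illustrations, not taken from this claim's sources, and the template is one common CoT phrasing rather than a prescribed one.

```python
def build_prompt(question: str, use_cot: bool = True) -> str:
    """Wrap a question in a Chain-of-Thought (CoT) style instruction.

    With use_cot=True, the model (e.g. LLaMA 2 13B) is asked to reason
    step by step before giving a final answer; with use_cot=False, it is
    asked to answer directly. Hypothetical helper for illustration only.
    """
    if use_cot:
        return (
            f"Question: {question}\n"
            "Think step by step, then give the final answer "
            "on a line starting with 'Answer:'."
        )
    return f"Question: {question}\nAnswer:"


# An ambiguous question (no units, underspecified times) is the kind of
# input the claim warns about: the model may hallucinate specifics.
cot_prompt = build_prompt(
    "A train leaves at 3 and arrives at 5. How long is the trip?"
)
direct_prompt = build_prompt(
    "A train leaves at 3 and arrives at 5. How long is the trip?",
    use_cot=False,
)
print(cot_prompt)
print(direct_prompt)
```

The same question string is wrapped two ways so the two prompting styles can be compared on identical inputs.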

Authors

Sources

Referenced by nodes (1)