claim
Multi-stage retrieval is learned end-to-end with a contrastive loss over bi-encoder and cross-encoder representations: a fast bi-encoder narrows the corpus to a candidate set, which a slower cross-encoder then rescores jointly with the query. Because the rerank stage spends extra computation per query at inference, this serves as an early example of test-time compute with a Transformer language model.
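A minimal sketch of the two-stage pipeline the claim describes. The helpers `embed`, `cross_score`, `info_nce`, and `search` are illustrative names, and toy functions stand in for the Transformer encoders: `embed` fakes a bi-encoder with deterministic pseudo-embeddings, and `cross_score` fakes a cross-encoder with token overlap. The structure (cheap retrieval, expensive rerank, contrastive training loss) is the point, not the scoring quality.

```python
import numpy as np

def embed(text, dim=16):
    """Toy bi-encoder: deterministic pseudo-embedding (stand-in for a Transformer)."""
    seed = abs(sum((i + 1) * ord(c) for i, c in enumerate(text))) % (2**32)
    v = np.random.default_rng(seed).normal(size=dim)
    return v / np.linalg.norm(v)

def cross_score(query, doc):
    """Toy cross-encoder: scores the (query, doc) pair jointly.
    Token overlap stands in for a Transformer over the concatenated pair."""
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / max(len(q | d), 1)

def info_nce(q_vec, pos_vec, neg_vecs, temp=0.1):
    """Contrastive (InfoNCE) loss of the kind used to train both encoder stages:
    pull the positive pair together, push negatives away."""
    sims = np.array([q_vec @ pos_vec] + [q_vec @ n for n in neg_vecs]) / temp
    return float(-sims[0] + np.log(np.exp(sims).sum()))

def search(query, docs, k=3):
    """Stage 1: cheap bi-encoder scoring over all docs, keep top-k.
    Stage 2: expensive cross-encoder rerank of the candidates -- the extra
    per-query computation here is the 'test-time compute' in the claim."""
    q_vec = embed(query)
    doc_vecs = np.stack([embed(d) for d in docs])
    top_k = np.argsort(doc_vecs @ q_vec)[::-1][:k]
    return sorted(top_k, key=lambda i: cross_score(query, docs[i]), reverse=True)

docs = [
    "transformers for dense retrieval",
    "contrastive loss trains dual encoders",
    "cooking pasta at home",
    "reranking with cross encoders",
]
ranking = search("cross encoder reranking", docs, k=3)
loss = info_nce(embed("query"), embed("relevant doc"), [embed("random doc")])
```

In a real system the bi-encoder's document embeddings are precomputed and indexed, so stage 1 is amortized; only the cross-encoder rerank adds inference-time cost per query.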
Authors
Sources
- EdinburghNLP/awesome-hallucination-detection (GitHub, github.com)
Referenced by nodes (1)
- Transformer concept