claim
Elastic has developed an LLM observability module that ingests prompts, responses, latency metrics, and safety signals into Elasticsearch indices for organizations running the Elastic Stack.
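The shape of such a record can be sketched as below. This is a minimal illustration, not Elastic's actual module: the index name `llm-observability` and all field names are assumptions, and only the commented lines use the official `elasticsearch-py` client.

```python
from datetime import datetime, timezone

def build_llm_event(prompt, response, latency_ms, safety_flags):
    """Assemble one observability document for a single LLM call.

    Field names are illustrative, not Elastic's schema.
    """
    return {
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "prompt": prompt,
        "response": response,
        "latency_ms": latency_ms,
        "safety": {
            "flags": safety_flags,          # e.g. ["pii", "jailbreak"]
            "flagged": bool(safety_flags),  # convenience field for filtering
        },
    }

doc = build_llm_event(
    prompt="What is RAG?",
    response="Retrieval-augmented generation combines search with an LLM.",
    latency_ms=412,
    safety_flags=[],
)

# Shipping the document with the official Python client would look like:
#   from elasticsearch import Elasticsearch
#   es = Elasticsearch("http://localhost:9200")
#   es.index(index="llm-observability", document=doc)
```

Once indexed, such documents can be queried and aggregated in Kibana like any other Elasticsearch data (e.g. latency percentiles, counts of flagged responses).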
Authors
Sources
- LLM Observability: How to Monitor AI When It Thinks in Tokens | TTMS (ttms.com)
Referenced by nodes (3)
- LLM observability concept
- prompts concept
- Elastic entity