Native Observability
See everything, from request to response. OpenTelemetry instrumentation is built into the execution engine. Distributed traces connect your API request through workflow steps to every LLM call and tool invocation. No separate monitoring tools needed.
Built-in Tracing
OpenTelemetry instrumentation built into the runtime. Every workflow step, LLM call, and tool invocation is automatically traced. No manual instrumentation needed.
End-to-End Visibility
Distributed traces connect your API request through all workflow steps. See the complete execution path. Understand where time is spent and where failures occur.
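Connecting an API request to downstream steps works by propagating a shared trace ID. OpenTelemetry uses the W3C Trace Context `traceparent` header for this; the sketch below shows that mechanism in plain Python, assuming standard W3C formatting rather than documenting this product's wire format.

```python
import re
import secrets

# W3C Trace Context header: version-traceid-spanid-flags, all hex.
TRACEPARENT_RE = re.compile(r"^00-([0-9a-f]{32})-([0-9a-f]{16})-([0-9a-f]{2})$")

def new_traceparent():
    """Start a fresh trace at the incoming API request."""
    return f"00-{secrets.token_hex(16)}-{secrets.token_hex(8)}-01"

def child_traceparent(parent):
    """Continue the same trace into a downstream step:
    keep the trace-id, mint a fresh span-id."""
    m = TRACEPARENT_RE.match(parent)
    if not m:
        raise ValueError("malformed traceparent")
    trace_id, _, flags = m.groups()
    return f"00-{trace_id}-{secrets.token_hex(8)}-{flags}"

root = new_traceparent()
step = child_traceparent(root)
# Both headers share a trace-id, so a tracing backend can stitch
# the request and every step into one end-to-end trace.
```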
Rich Context
Every span includes inputs, outputs, and metadata. Inspect LLM prompts and responses. View tool invocation details. Debug with complete context.
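"Inputs, outputs, and metadata" on a span typically means flat string-keyed attributes, with large payloads serialized and truncated. A minimal sketch of what one LLM-call span might carry; the attribute keys and the `llm_span_attributes` helper are illustrative, not this product's schema.

```python
import json

def llm_span_attributes(prompt, response, model, usage, max_len=200):
    """Build the attribute map for one LLM-call span.
    Payload fields are clipped so spans stay small."""
    def clip(text):
        return text if len(text) <= max_len else text[:max_len] + "…"
    return {
        "llm.model": model,              # key names are illustrative
        "llm.prompt": clip(prompt),
        "llm.response": clip(response),
        "llm.usage": json.dumps(usage),  # nested data flattened to JSON
    }

attrs = llm_span_attributes(
    prompt="Summarize the release notes",
    response="The release adds built-in tracing.",
    model="example-model",
    usage={"input_tokens": 12, "output_tokens": 8},
)
```

With attributes like these on every span, a trace viewer can show the exact prompt, response, and token counts at the point of failure.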
Performance Insights
Identify bottlenecks with waterfall views. Track P50, P95, P99 latencies. Monitor LLM token usage and costs. Optimize based on real data.
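P50/P95/P99 latencies come from aggregating per-span durations. A minimal nearest-rank percentile sketch over made-up sample durations (the function and numbers are illustrative, not taken from the product):

```python
import math

def percentile(samples, p):
    """Nearest-rank percentile: the smallest value with at least
    p% of samples at or below it."""
    if not samples:
        raise ValueError("no samples")
    ordered = sorted(samples)
    rank = max(1, math.ceil(p / 100 * len(ordered)))
    return ordered[rank - 1]

# Hypothetical per-request durations in milliseconds.
latencies_ms = [120, 95, 110, 480, 105, 98, 130, 102, 115, 2200]
p50 = percentile(latencies_ms, 50)  # typical request: 110
p95 = percentile(latencies_ms, 95)  # tail dominated by outliers: 2200
p99 = percentile(latencies_ms, 99)  # 2200
```

The gap between P50 and P99 here is why median latency alone hides the slow requests a waterfall view would surface.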