Latitude Telemetry instruments your AI application and sends traces to Latitude. Built entirely on OpenTelemetry, it works alongside your existing observability stack (Datadog, Sentry, Jaeger, etc.) without conflicts or vendor lock-in. Once connected, every LLM execution becomes a trace in Latitude that you can inspect in the Traces view, enrich with scores and annotations, and evaluate with Evaluations.
Documentation Index
Fetch the complete documentation index at: https://docs.latitude.so/llms.txt
Use this file to discover all available pages before exploring further.
Quick Start
One function sets up everything: auto-instrumentation, the Latitude exporter, and async context propagation. The TypeScript and Python SDKs expose the same one-call setup; the sketches in this guide use TypeScript.
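A minimal sketch of what that single call can look like. The setupTelemetry name, the @latitude-data/telemetry package path, and the option names are assumptions for illustration; the instrumentation keys come from the tables below, and the exact API is in the TypeScript SDK reference.

```typescript
// Hypothetical Quick Start sketch: setupTelemetry, the package path, and the
// option names are assumptions; see the TypeScript SDK reference for the real API.
import { setupTelemetry } from '@latitude-data/telemetry'

setupTelemetry({
  apiKey: process.env.LATITUDE_API_KEY, // your Latitude project API key
  instrumentations: ['openai'], // SDKs to patch; keys listed in the tables below
})

// From here on, every OpenAI call in this process emits spans that the
// Latitude exporter ships as traces, with async context propagated for you.
```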
Adding Context with capture()
Auto-instrumentation traces LLM calls without any extra code. Use capture() when you want to attach business context such as user IDs, session IDs, tags, or metadata to group and filter traces in Latitude.
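For example, wrapping a chat entrypoint could look like the sketch below. The capture() import path and attribute names (userId, sessionId, tags) are assumptions; the OpenAI calls are the standard openai-node API.

```typescript
// Hypothetical sketch: the capture() import path and attribute names are
// assumptions; check the TypeScript SDK reference for the exact signature.
import OpenAI from 'openai'
import { capture } from '@latitude-data/telemetry'

const openai = new OpenAI()

async function handleChatRequest(userId: string, sessionId: string, message: string) {
  // Wrap the entrypoint once; every auto-instrumented LLM call inside the
  // callback gets these attributes on its spans.
  return capture({ userId, sessionId, tags: ['chat'] }, async () => {
    const completion = await openai.chat.completions.create({
      model: 'gpt-4o-mini',
      messages: [{ role: 'user', content: message }],
    })
    return completion.choices[0].message.content
  })
}
```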
capture() does not create spans. It only attaches context to spans created by auto-instrumentation. Wrap the request or agent entrypoint once; you don’t need to wrap every internal step.
Streaming
When streaming responses, consume the stream inside the capture() callback so the span duration covers the full operation and child spans nest correctly:
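A sketch of the streaming pattern, under the same assumed capture() signature as above; the streaming loop itself is the standard openai-node API.

```typescript
// The capture() import and signature are assumptions; the streaming loop is
// the standard openai-node API.
import OpenAI from 'openai'
import { capture } from '@latitude-data/telemetry'

const openai = new OpenAI()

await capture({ userId: 'user-123' }, async () => {
  const stream = await openai.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [{ role: 'user', content: 'Hello!' }],
    stream: true,
  })

  // Consume the stream inside the callback so the span stays open until the
  // last chunk arrives and child spans nest under it correctly.
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? '')
  }
})
```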
How It Fits Into Your Stack
Latitude Telemetry is built on OpenTelemetry standards:
- Auto-instrumentation patches your LLM SDK (OpenAI, Anthropic, etc.) to emit spans for every call.
- LatitudeSpanProcessor filters for LLM-relevant spans (gen_ai.*, ai.*, openinference.* attributes) and exports them to Latitude via OTLP.
- capture() uses OpenTelemetry’s native context.with() to attach Latitude-specific attributes (user, session, tags) to spans within its scope.
If you already run your own OpenTelemetry pipeline, you can register LatitudeSpanProcessor alongside your existing processors, as sketched below. See the TypeScript SDK or Python SDK reference for advanced setup.
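As an illustration, registering Latitude's processor on an existing OpenTelemetry Node pipeline could look like this. The NodeTracerProvider, BatchSpanProcessor, and OTLPTraceExporter calls are standard OpenTelemetry JS; the LatitudeSpanProcessor import path and constructor options are assumptions, so check the SDK reference before copying.

```typescript
// Standard OpenTelemetry JS setup; the LatitudeSpanProcessor import path and
// its options are assumptions -- see the SDK reference for the real ones.
import { NodeTracerProvider, BatchSpanProcessor } from '@opentelemetry/sdk-trace-node'
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http'
import { LatitudeSpanProcessor } from '@latitude-data/telemetry' // assumed export

const provider = new NodeTracerProvider({
  spanProcessors: [
    // Your existing pipeline keeps receiving every span.
    new BatchSpanProcessor(new OTLPTraceExporter({ url: 'http://localhost:4318/v1/traces' })),
    // Latitude's processor forwards only the LLM-relevant spans to Latitude.
    new LatitudeSpanProcessor({ apiKey: process.env.LATITUDE_API_KEY }), // assumed options
  ],
})
provider.register()
```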
Supported Integrations
Providers
| Provider | Instrumentation | Package (TS) | Package (Python) |
|---|---|---|---|
| OpenAI | "openai" | openai | openai |
| Anthropic | "anthropic" | @anthropic-ai/sdk | anthropic |
| Amazon Bedrock | "bedrock" | @aws-sdk/client-bedrock-runtime | boto3 |
| Cohere | "cohere" | cohere-ai | cohere |
| Together AI | "togetherai" | together-ai | together |
| Vertex AI | "vertexai" | @google-cloud/vertexai | google-cloud-aiplatform |
| Google AI Platform | "aiplatform" | @google-cloud/aiplatform | google-cloud-aiplatform |
| Azure OpenAI | "openai" | openai | openai |
Frameworks
| Framework | Instrumentation | Package (TS) | Package (Python) |
|---|---|---|---|
| Vercel AI SDK | - | ai | - |
| LangChain | "langchain" | langchain | langchain-core |
| LlamaIndex | "llamaindex" | llamaindex | llama-index |
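If your app mixes several of the SDKs above, the instrumentation keys would be passed together at setup. A sketch under the same assumptions as the Quick Start (setupTelemetry and its options are illustrative; the keys come from the Instrumentation columns):

```typescript
// Hypothetical: setupTelemetry and its options are assumptions; the keys are
// taken from the Instrumentation columns above.
import { setupTelemetry } from '@latitude-data/telemetry'

setupTelemetry({
  apiKey: process.env.LATITUDE_API_KEY,
  instrumentations: ['openai', 'anthropic', 'langchain'],
})
```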
Next Steps
- SDK Reference: TypeScript SDK · Python SDK
- Other languages: OTLP Exporter — connect from Go, Java, Ruby, .NET, or any OpenTelemetry-supported language
- Understand your data: Traces · Sessions
- Act on your data: Evaluations · Annotations