Latitude Telemetry instruments your AI application and sends traces to Latitude. Built entirely on OpenTelemetry, it works alongside your existing observability stack (Datadog, Sentry, Jaeger, etc.) without conflicts or vendor lock-in. Once connected, every LLM execution becomes a trace in Latitude that you can inspect in the Traces view, enrich with scores and annotations, and evaluate with Evaluations.

Documentation Index
Fetch the complete documentation index at: https://docs.latitude.so/llms.txt
Use this file to discover all available pages before exploring further.
Agentic installation
The fastest way to add Latitude is to let your coding agent (Claude Code, Cursor, Windsurf, etc.) install it for you. The Latitude skill guides the agent through codebase discovery (existing OpenTelemetry, conflicting LLM-observability vendors, which LLM SDKs are in use, where LLM calls actually happen), picks the right install path, places initialization correctly, and verifies that traces land in your project.

Ask your coding agent
Paste this prompt into your agent:

> Install the Latitude AI skill from github.com/latitude-dev/skills and use it to add tracing to this application with Latitude following best practices.

The agent will fetch the skill, read your codebase, ask only the questions it can't answer from the code, and produce a working install.
Install the skill manually
If you prefer to install the skill ahead of time (no global setup needed; npx runs it directly):
> Add tracing to this application with Latitude following best practices.

The skill covers TypeScript and Python and the providers and frameworks listed in Supported Integrations, and it audits existing OpenTelemetry setups for compatibility.
Manual installation
One SDK bootstrap class sets up everything: auto-instrumentation, the Latitude exporter, and async context propagation. See the TypeScript SDK and Python SDK references for the full setup in each language.
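As a rough sketch of that bootstrap in Python (the package, class, and parameter names below are assumptions based on this page, not a verbatim copy of the SDK; confirm them against the Python SDK reference):

```python
# Hypothetical bootstrap: one object wires up auto-instrumentation,
# the Latitude exporter, and async context propagation.
from latitude_telemetry import Instrumentors, Telemetry  # assumed import path

telemetry = Telemetry(
    "your-latitude-api-key",       # project API key from Latitude settings
    instrumentations=[
        Instrumentors.OpenAI,      # patch the OpenAI SDK to emit spans
        Instrumentors.Anthropic,   # patch the Anthropic SDK as well
    ],
)
# From here on, every call made through the patched SDKs is traced
# and exported to your Latitude project.
```

Run this once at process startup, before any LLM client is created, so the SDK patches are in place for every call.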
Adding Context with capture()
Auto-instrumentation traces LLM calls without any extra code. Use capture() when you want to attach business context such as user IDs, session IDs, tags, or metadata to group and filter traces in Latitude.
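A sketch of that usage in Python (the capture() signature and keyword names such as user_id, session_id, and tags are assumptions based on this page; check the Python SDK reference for the exact API):

```python
# Hypothetical setup, as in the bootstrap example above.
from latitude_telemetry import Telemetry  # assumed import path
from openai import OpenAI

telemetry = Telemetry("your-latitude-api-key")
client = OpenAI()

# Wrap the request entrypoint once; auto-instrumented LLM calls made
# inside this scope inherit the user, session, and tag context.
with telemetry.capture(user_id="user-123", session_id="sess-456", tags=["checkout"]):
    client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Summarize my cart"}],
    )
```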
capture() does not create spans. It only attaches context to spans created by auto-instrumentation. Wrap the request or agent entrypoint once; you don't need to wrap every internal step.

Streaming
When streaming responses, consume the stream inside the capture() callback so the span duration covers the full operation and child spans nest correctly.
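A Python sketch of the streaming pattern (telemetry and client setup as in the earlier examples; the capture() signature is an assumption based on this page):

```python
# Hypothetical setup, mirroring the earlier examples.
from latitude_telemetry import Telemetry  # assumed import path
from openai import OpenAI

telemetry = Telemetry("your-latitude-api-key")
client = OpenAI()

with telemetry.capture(user_id="user-123"):
    stream = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Write a haiku"}],
        stream=True,
    )
    # Consume the stream INSIDE the capture scope so the span stays open
    # until the last chunk arrives and child spans nest correctly.
    text = "".join(chunk.choices[0].delta.content or "" for chunk in stream)
```

If you returned the unconsumed stream and iterated it outside the block, the span would close early and report a misleadingly short duration.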
How It Fits Into Your Stack
Latitude Telemetry is built on OpenTelemetry standards:

- Auto-instrumentation patches your LLM SDK (OpenAI, Anthropic, etc.) to emit spans for every call.
- LatitudeSpanProcessor filters for LLM-relevant spans (gen_ai.*, ai.*, openinference.* attributes) and exports them to Latitude via OTLP.
- capture() uses OpenTelemetry's native context.with() to attach Latitude-specific attributes (user, session, tags) to spans within its scope.
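The filtering step can be illustrated with a plain predicate over span attributes. This is a simplification for illustration only, not the SDK's actual LatitudeSpanProcessor code:

```python
# Attribute-key prefixes the page describes as LLM-relevant.
LLM_ATTRIBUTE_PREFIXES = ("gen_ai.", "ai.", "openinference.")

def is_llm_span(attributes: dict) -> bool:
    """Return True if any attribute key marks this span as LLM-relevant."""
    return any(
        key.startswith(prefix)
        for key in attributes
        for prefix in LLM_ATTRIBUTE_PREFIXES
    )

print(is_llm_span({"gen_ai.request.model": "gpt-4o"}))  # True: matches gen_ai.*
print(is_llm_span({"http.method": "GET"}))              # False: not an LLM span
```

Spans that fail this kind of check (plain HTTP spans, database spans, and so on) stay with your existing exporters and never reach Latitude.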
If you already have an OpenTelemetry setup, initialize Latitude second so Latitude can attach to the existing provider when possible. You can also add LatitudeSpanProcessor explicitly alongside your existing processors. See the TypeScript SDK or Python SDK reference for advanced setup.
Supported Integrations
Providers
| Provider | Instrumentation | Package (TS) | Package (Python) |
|---|---|---|---|
| OpenAI | "openai" | openai | openai |
| Anthropic | "anthropic" | @anthropic-ai/sdk | anthropic |
| Amazon Bedrock | "bedrock" | @aws-sdk/client-bedrock-runtime | boto3 |
| Cohere | "cohere" | cohere-ai | cohere |
| Together AI | "togetherai" | together-ai | together |
| Vertex AI | "vertexai" | @google-cloud/vertexai | google-cloud-aiplatform |
| Google AI Platform | "aiplatform" | @google-cloud/aiplatform | google-cloud-aiplatform |
| Azure OpenAI | "openai" | openai | openai |
Frameworks
| Framework | Instrumentation | Package (TS) | Package (Python) |
|---|---|---|---|
| Vercel AI SDK | - | ai | - |
| OpenAI Agents SDK | "openai-agents" | @openai/agents | openai-agents |
| LangChain | "langchain" | langchain | langchain-core |
| LlamaIndex | "llamaindex" | llamaindex | llama-index |
| Mastra | - | @mastra/core | - |
Next Steps
- SDK Reference: TypeScript SDK · Python SDK
- Other languages: OTLP Exporter — connect from Go, Java, Ruby, .NET, or any OpenTelemetry-supported language
- Understand your data: Traces · Sessions
- Act on your data: Evaluations · Annotations