Latitude HTTP API Documentation
This guide explains how to use the Latitude HTTP API to interact with the Prompt Manager and run AI-powered conversations.

Authentication

All API requests require authentication. Include your API key in the Authorization header of your HTTP requests.
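For instance, an authenticated request built with Python's standard library might look like the sketch below. The Bearer scheme, the placeholder key, and the example IDs are assumptions for illustration:

```python
import urllib.request

# Placeholder key; the "Bearer" scheme is an assumption -- check your dashboard.
API_KEY = "your-api-key"

req = urllib.request.Request(
    "https://gateway.latitude.so/api/v3/projects/1/versions/live/documents/my-prompt",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
# urllib.request.urlopen(req) would send the authenticated GET request.
print(req.get_header("Authorization"))
```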
Base URL
The base URL for API requests depends on your environment:

https://gateway.latitude.so/api/v3
Rate Limiting
The API enforces rate limits based on your API key to ensure fair usage and prevent abuse. Limits depend on your subscription plan:

- Hobby Plan: 10 requests per second
- Team Plan: 166 requests per second (10,000 requests per minute)
- Enterprise Plan: 500 requests per second (30,000 requests per minute)
The API communicates rate limit status through these response headers:

- Retry-After: The number of seconds to wait before making a new request.
- X-RateLimit-Limit: The maximum number of requests allowed in the current period.
- X-RateLimit-Remaining: The number of requests remaining in the current period.
- X-RateLimit-Reset: The timestamp when the rate limit will reset.
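A client can honor Retry-After with a small retry wrapper. This is an illustrative sketch, not an official client: `send` stands in for whatever HTTP call you make and is assumed to return a `(status, headers, body)` tuple.

```python
import time

def with_rate_limit_retry(send, max_retries=3, sleep=time.sleep):
    """Call send(); on HTTP 429, wait Retry-After seconds and retry.

    `send` is any zero-argument callable returning (status, headers, body);
    wrap your real HTTP request in it.
    """
    for attempt in range(max_retries):
        status, headers, body = send()
        if status != 429 or attempt == max_retries - 1:
            return status, headers, body
        # Wait the number of seconds the API asked for (default to 1s).
        sleep(int(headers.get("Retry-After", 1)))
```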
Endpoints
1. Get a Prompt
Retrieve a specific prompt by its path. Use this endpoint to fetch the content and configuration of an existing prompt in your project.

Endpoint: GET /projects/{projectId}/versions/{versionUuid}/documents/{path}
Path Parameters:
- projectId: Your project ID (required)
- versionUuid: Version UUID (required; the SDKs default this to 'live')
- path: Path to the document (required)
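Putting the pieces together, fetching a prompt might look like this sketch (the project ID and prompt path are placeholders):

```python
import urllib.request

project_id = 123                        # placeholder project ID
version_uuid = "live"                   # targets the published version
path = "onboarding/welcome-email"       # placeholder prompt path

req = urllib.request.Request(
    f"https://gateway.latitude.so/api/v3/projects/{project_id}"
    f"/versions/{version_uuid}/documents/{path}",
    headers={"Authorization": "Bearer your-api-key"},
)
# urllib.request.urlopen(req) performs the GET and returns the prompt JSON.
print(req.full_url)
```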
2. Get or Create a Prompt
Retrieve an existing prompt or create it if it doesn't exist. This endpoint provides an idempotent way to ensure a prompt exists at a specific path without checking first.

Endpoint: POST /projects/{projectId}/versions/{versionUuid}/documents/get-or-create
Path Parameters:
- projectId: Your project ID (required)
- versionUuid: Version UUID (required; the SDKs default this to 'live')

Body Parameters:

- path: Path to the prompt (required)
- prompt: Prompt content to use (optional, defaults to empty)
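A sketch of the request; the prompt content is a placeholder written in the `{{ }}` template style PromptL uses:

```python
import json
import urllib.request

body = {
    "path": "onboarding/welcome-email",  # created at this path if missing
    "prompt": "Hello {{ name }}!",       # optional initial content
}
req = urllib.request.Request(
    "https://gateway.latitude.so/api/v3/projects/123/versions/live"
    "/documents/get-or-create",
    data=json.dumps(body).encode(),
    headers={
        "Authorization": "Bearer your-api-key",
        "Content-Type": "application/json",
    },
)
print(req.get_method())
```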
3. Create or Update a Prompt
Create a new prompt or update an existing one in a single operation. This endpoint provides more control than get-or-create, including the ability to update live commits with the force flag.
Endpoint: POST /projects/{projectId}/versions/{versionUuid}/documents/create-or-update
Path Parameters:
- projectId: Your project ID (required)
- versionUuid: Version UUID (required; the SDKs default this to 'live')

Body Parameters:

- path: Path to the prompt (required)
- prompt: Content of the prompt (required)
- force: Allow modifications to live/merged commits (optional, defaults to false)
- If the prompt does not exist at the specified path, it will be created
- If the prompt already exists at the path, it will be updated with the new content
- By default, modifications are only allowed on draft commits (not live/merged)
- When force: true, creating or updating prompts in live commits is allowed (use with caution)

If you attempt to modify a live/merged commit without the force flag, the API returns a 400 status code.

Use cases:

- Single API call for upsert operations: no need to check if a prompt exists before creating/updating
- Programmatic prompt updates: update prompts from your CI/CD pipeline or automation scripts
- Emergency hotfixes: use force: true to quickly fix production prompts when needed
- Batch operations: efficiently create or update multiple prompts in a loop
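An upsert against the live commit might be sketched as follows (IDs and content are placeholders):

```python
import json
import urllib.request

payload = {
    "path": "onboarding/welcome-email",
    "prompt": "Hello {{ name }}, welcome aboard!",  # new content (placeholder)
    "force": True,  # allow writing to the live commit; use with caution
}
req = urllib.request.Request(
    "https://gateway.latitude.so/api/v3/projects/123/versions/live"
    "/documents/create-or-update",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": "Bearer your-api-key",
        "Content-Type": "application/json",
    },
)
# Without "force": true, the same call against a live commit returns HTTP 400.
```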
4. Create a Version (Commit)
Create a new draft version (commit) for a project. Versions allow you to manage changes to your prompts before publishing them to production.

Endpoint: POST /projects/{projectId}/versions
Path Parameters:
- projectId: Your project ID (required)

Body Parameters:

- name: Name/title for the new version (required)
- Create draft versions: Start working on prompt changes in isolation
- Version control: Track different iterations of your prompts
- CI/CD integration: Programmatically create versions from your deployment pipeline
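A minimal sketch of creating a draft version (the version name and project ID are placeholders; the exact response shape is not shown here):

```python
import json
import urllib.request

req = urllib.request.Request(
    "https://gateway.latitude.so/api/v3/projects/123/versions",
    data=json.dumps({"name": "tone-update-draft"}).encode(),  # placeholder name
    headers={
        "Authorization": "Bearer your-api-key",
        "Content-Type": "application/json",
    },
)
# urlopen(req) would create the draft; keep its UUID for later publishing.
```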
5. Publish a Version (Commit)
Publish a draft version (commit) to make it the live/production version. This merges the draft changes and assigns it a version number.

Endpoint: POST /projects/{projectId}/versions/{versionUuid}/publish
Path Parameters:
- projectId: Your project ID (required)
- versionUuid: UUID of the draft version to publish (required)

Body Parameters:

- title: Optional title for the published version (if not provided, uses the existing title)
- description: Optional description or release notes for the published version
- Deploy to production: Publish tested prompt changes to make them live
- Release management: Track which version is currently in production
- Automated deployments: Publish versions from CI/CD pipelines after successful tests
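Publishing a draft can be sketched like this (the draft UUID, title, and description are placeholders):

```python
import json
import urllib.request

version_uuid = "123e4567-e89b-12d3-a456-426614174000"  # placeholder draft UUID
req = urllib.request.Request(
    f"https://gateway.latitude.so/api/v3/projects/123/versions/{version_uuid}/publish",
    data=json.dumps({
        "title": "v2 welcome email",                 # optional
        "description": "Softer tone, shorter copy",  # optional release notes
    }).encode(),
    headers={
        "Authorization": "Bearer your-api-key",
        "Content-Type": "application/json",
    },
)
```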
6. Run a Prompt
Execute a prompt with optional parameters. This endpoint processes your prompt template, sends it to the configured AI provider, and returns the generated response. Supports both streaming and non-streaming modes, as well as background processing for long-running operations.

Endpoint: POST /projects/{projectId}/versions/{versionUuid}/documents/run
Path Parameters:
- projectId: Your project ID (required)
- versionUuid: Version UUID (required; the SDKs default this to 'live')

Body Parameters:

- stream: Optional boolean (defaults to false). When true, the response is a stream of Server-Sent Events (SSE); when false, a single JSON response containing the last event is returned.
- background: Optional boolean (defaults to false). When true, the request is enqueued for background processing and returns immediately with a conversation UUID.
- userMessage: Optional string to start the conversation with a user message.
- customIdentifier: Optional string for custom identification of the run.
- tools: Optional array of tool names to enable for this run.
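A sketch of a non-streaming run body. Note that `path` and `parameters` are assumptions here (the prompt to run and its template variables), since this section only lists the optional flags:

```python
import json

run_body = {
    "path": "onboarding/welcome-email",  # assumed field: which prompt to run
    "parameters": {"name": "Ada"},       # assumed field: template variables
    "stream": False,      # single JSON response with the final event
    "background": False,  # True would enqueue and return a conversation UUID
    "customIdentifier": "signup-flow",
    "tools": [],          # no extra tools enabled for this run
}
print(json.dumps(run_body, indent=2))
```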
Response:

- If background is true: returns immediately with a conversation UUID for background processing.
- If stream is true: the response is a stream of Server-Sent Events (SSE). Check out the Streaming Events guide for more information about the specific events you can expect.
- If stream is false: a single JSON response is returned with the final event (typically the chain-complete event) in the following structure:
Message follows the PromptL format.
ToolCall has the following format:
7. Chat
Continue a multi-turn conversation by sending additional messages to an existing conversation thread. This endpoint allows you to maintain context across multiple exchanges with the AI model by building upon messages from a previous run. The conversation history is automatically managed, and each new message is appended to the existing message chain.

Endpoint: POST /conversations/{conversationUuid}/chat
Path Parameters:
- conversationUuid: UUID of the conversation (required)

Body Parameters:

- messages: Additional messages to send (required). Messages follow the PromptL format; if you're using a different method to run your prompts, you'll need to format your messages accordingly.
- stream: Optional boolean (defaults to false). When true, the response is a stream of Server-Sent Events (SSE); when false, a single JSON response containing the last event is returned. Check out the Streaming Events guide for more information about the specific events you can expect.
Message follows the PromptL format.
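A follow-up message can be sketched like this; the conversation UUID is a placeholder, and the role/content message shape is assumed from the PromptL format:

```python
import json
import urllib.request

conversation_uuid = "123e4567-e89b-12d3-a456-426614174000"  # from a previous run
chat_body = {
    # PromptL-style messages: a role plus content.
    "messages": [
        {"role": "user", "content": "Can you make that reply shorter?"},
    ],
    "stream": False,
}
req = urllib.request.Request(
    f"https://gateway.latitude.so/api/v3/conversations/{conversation_uuid}/chat",
    data=json.dumps(chat_body).encode(),
    headers={
        "Authorization": "Bearer your-api-key",
        "Content-Type": "application/json",
    },
)
```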
8. Get a Conversation
Retrieve the conversation history by its UUID. Use this endpoint to fetch the complete message history from a completed conversation, including all user messages, assistant responses, and tool calls.

Endpoint: GET /conversations/{conversationUuid}
Path Parameters:
- conversationUuid: UUID of the conversation (required)
Message follows the PromptL format.
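Fetching the history is a plain authenticated GET (the UUID below is a placeholder):

```python
import urllib.request

conversation_uuid = "123e4567-e89b-12d3-a456-426614174000"  # placeholder
req = urllib.request.Request(
    f"https://gateway.latitude.so/api/v3/conversations/{conversation_uuid}",
    headers={"Authorization": "Bearer your-api-key"},
)
# urlopen(req) returns the full PromptL-format message history.
```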
9. Stop a Conversation
Stop an active run that is currently processing. This is useful when you need to cancel a long-running prompt execution, such as when the output is no longer needed or when you want to prevent further token consumption.

Endpoint: POST /conversations/{conversationUuid}/stop
Path Parameters:
- conversationUuid: UUID of the conversation (required)
10. Attach to a Conversation
Attach to an active run to receive its output events. This endpoint is particularly useful when you've started a prompt execution with background: true and want to stream the results. You can attach at any point during the run's execution to receive the remaining events.
Endpoint: POST /conversations/{conversationUuid}/attach
Path Parameters:
- conversationUuid: UUID of the conversation (required)

Body Parameters:

- stream: Optional boolean (defaults to false). When true, the response is a stream of Server-Sent Events (SSE); when false, a single JSON response containing the final event is returned.
Response:

- If stream is true: the response is a stream of Server-Sent Events (SSE). Check out the Streaming Events guide for more information about the specific events you can expect.
- If stream is false: a single JSON response is returned with the final event in the following structure:
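Attaching to a background run with streaming enabled can be sketched as follows (the UUID is a placeholder):

```python
import json
import urllib.request

conversation_uuid = "123e4567-e89b-12d3-a456-426614174000"  # background run's UUID
req = urllib.request.Request(
    f"https://gateway.latitude.so/api/v3/conversations/{conversation_uuid}/attach",
    data=json.dumps({"stream": True}).encode(),  # stream the remaining events
    headers={
        "Authorization": "Bearer your-api-key",
        "Content-Type": "application/json",
    },
)
```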
11. Annotate a Log
Add a manual evaluation score to a conversation log. Use this endpoint to provide human feedback or manual assessments of prompt outputs, which can be used for quality tracking and model improvement.

Endpoint: POST /conversations/{conversationUuid}/evaluations/{evaluationUuid}/annotate
Path Parameters:
- conversationUuid: UUID of the conversation to annotate
- evaluationUuid: UUID of the evaluation to use
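An annotation request might be sketched as below. Both UUIDs are placeholders, and the body shape (a numeric `score` field) is an assumption, since this section does not document the body:

```python
import json
import urllib.request

conversation_uuid = "123e4567-e89b-12d3-a456-426614174000"  # placeholder
evaluation_uuid = "9b2f0c64-1111-2222-3333-444455556666"    # placeholder
body = {
    "score": 4,  # assumed body field: the manual evaluation score
}
req = urllib.request.Request(
    f"https://gateway.latitude.so/api/v3/conversations/{conversation_uuid}"
    f"/evaluations/{evaluation_uuid}/annotate",
    data=json.dumps(body).encode(),
    headers={
        "Authorization": "Bearer your-api-key",
        "Content-Type": "application/json",
    },
)
```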
12. Create Log Entry
Create a log entry for a prompt without executing it. This endpoint allows you to record prompt executions that happened outside of Latitude (e.g., direct LLM API calls) for tracking, analytics, and evaluation purposes.

Endpoint: POST /projects/{projectId}/versions/{versionUuid}/documents/logs
Path Parameters:
- projectId: Your project ID (required)
- versionUuid: Version UUID (required; the SDKs default this to 'live')
- Messages follow the PromptL format. If you’re using a different method to run your prompts, you’ll need to format your messages accordingly.
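A sketch of recording an external execution. The `path`, `messages`, and `response` body fields are assumptions about the body shape, since this section only documents the path parameters:

```python
import json
import urllib.request

log_body = {
    "path": "onboarding/welcome-email",  # assumed field: prompt to log against
    # PromptL-format messages that were actually sent to the model.
    "messages": [
        {"role": "user", "content": "Write a welcome email for Ada."},
    ],
    "response": "Hi Ada, welcome!",  # assumed field: the model's raw reply
}
req = urllib.request.Request(
    "https://gateway.latitude.so/api/v3/projects/123/versions/live/documents/logs",
    data=json.dumps(log_body).encode(),
    headers={
        "Authorization": "Bearer your-api-key",
        "Content-Type": "application/json",
    },
)
```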