TypeScript
Integrate Latitude into your Node.js applications using the TypeScript SDK.
The Latitude TypeScript SDK provides a convenient way to interact with the Latitude platform from your Node.js or browser applications.
Installation
The Latitude SDK is compatible with Node.js 16 or higher.
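A typical installation looks like this (assuming the package is published as `@latitude-data/sdk`; substitute your package manager of choice):

```shell
npm install @latitude-data/sdk
```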
Authentication and Initialization
Import the SDK and initialize it with your API key. You can generate API keys in your Latitude project settings under “API Access”.
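A minimal initialization sketch. The `Latitude` constructor name and the `@latitude-data/sdk` package name are assumptions based on common SDK conventions; check the package's own exports:

```typescript
import { Latitude } from '@latitude-data/sdk'

// Read the API key from the environment rather than hard-coding it
const latitude = new Latitude(process.env.LATITUDE_API_KEY!)
```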
You can also provide additional options during initialization:
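For example, options that scope subsequent calls to a specific project and version might look like this (the `projectId` and `versionUuid` option names are assumptions for illustration):

```typescript
import { Latitude } from '@latitude-data/sdk'

const latitude = new Latitude(process.env.LATITUDE_API_KEY!, {
  projectId: 123,      // hypothetical: the project to operate on
  versionUuid: 'live', // hypothetical: pin calls to a given version
})
```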
Keep your API key secure and avoid committing it directly into your codebase.
Examples
Check out our cookbook for more examples of how to use the Latitude SDK.
SDK Structure
The Latitude SDK is organized into several namespaces:
- prompts: Methods for managing and running prompts
- logs: Methods for creating and managing logs
- evaluations: Methods for triggering evaluations and creating results
Prompt Management
Get a Prompt
To retrieve a specific prompt by its path:
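A sketch of fetching a prompt by path (assuming a `prompts.get` method that takes the prompt path):

```typescript
const prompt = await latitude.prompts.get('weather/forecast-prompt')

console.log(prompt.content)
```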
Get All Prompts
To retrieve all prompts in your project:
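A sketch of listing every prompt (assuming a `prompts.getAll` method that returns an array):

```typescript
const prompts = await latitude.prompts.getAll()

for (const prompt of prompts) {
  console.log(prompt.path)
}
```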
Get or Create a Prompt
To get an existing prompt or create a new one if it doesn’t exist:
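For example (assuming a `prompts.getOrCreate` method keyed by path):

```typescript
// Returns the existing prompt, or creates an empty one at this path
const prompt = await latitude.prompts.getOrCreate('weather/forecast-prompt')
```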
You can also provide the content when creating a new prompt:
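A sketch of passing initial content (the `prompt` option name is an assumption):

```typescript
const prompt = await latitude.prompts.getOrCreate('weather/forecast-prompt', {
  prompt: 'Write a short weather forecast for {{location}}.',
})
```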
Running Prompts
Non-Streaming Run
Execute a prompt and get the complete response once generation is finished:
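A non-streaming run might look like this. The `parameters` option and the shape of the result (`response.text`, `uuid`) are assumptions; the `uuid` and `agentResponse` fields are mentioned elsewhere in these docs:

```typescript
const result = await latitude.prompts.run('weather/forecast-prompt', {
  parameters: { location: 'Barcelona' },
})

console.log(result.uuid)          // identifies this run for later reference
console.log(result.response.text) // the generated text
```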
If your prompt is an agent, an agentResponse property will be defined in the result. The structure of the response depends on the agent's configuration, although by default it will be: { "response": "Your agent's response" }.
Handling Streaming Responses
For real-time applications (like chatbots), use streaming to get response chunks as they are generated:
Using Tools with Prompts
You can provide tool handlers that the model can call during execution:
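A sketch of registering tool handlers. The `tools` option shape and the handler signature (arguments plus a `details` object, which the note below uses for `details.pauseExecution()`) are assumptions:

```typescript
const result = await latitude.prompts.run('weather/forecast-prompt', {
  parameters: { location: 'Barcelona' },
  tools: {
    // Invoked when the model calls the 'getWeather' tool
    getWeather: async ({ location }: { location: string }, details) => {
      // Return the tool result back to the model
      return { temperatureCelsius: 24, conditions: 'sunny' }
    },
  },
})
```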
If you need to pause the execution of a tool, you can do so by returning details.pauseExecution() in the tool handler. You can resume the conversation later by returning the tool results in the latitude.prompts.chat method.
Chat with a Prompt
Follow the conversation of a previously run prompt:
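A sketch of continuing a conversation with latitude.prompts.chat, using the uuid returned by a previous run (the exact argument order is an assumption):

```typescript
const followUp = await latitude.prompts.chat(conversationUuid, [
  { role: 'user', content: 'Can you make the forecast shorter?' },
])
```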
Messages follow the PromptL format. If you’re using a different method to run your prompts, you’ll need to format your messages accordingly.
Rendering Prompts
Prompt Rendering
Render a prompt locally without running it:
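A local-render sketch (assuming a `prompts.render` method that takes the prompt content and parameters and returns the resolved config and messages without calling any model):

```typescript
const { config, messages } = await latitude.prompts.render({
  prompt: { content: promptContent },
  parameters: { location: 'Barcelona' },
})

// `messages` can now be sent to any LLM provider yourself
```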
Chain Rendering
Render a chain of prompts locally:
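A chain-render sketch. An `onStep` handler (referenced by the agent-rendering note below) is assumed to run once per chain step, and `callYourProvider` is a hypothetical helper standing in for your own LLM call:

```typescript
const result = await latitude.prompts.renderChain({
  prompt,
  parameters: { location: 'Barcelona' },
  onStep: async ({ config, messages }) => {
    // Call your own LLM provider for this step and return its response
    return await callYourProvider(config, messages) // hypothetical helper
  },
})
```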
Agent Rendering
Render an agent prompt locally (similar to renderChain, but with a final agent result):
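A sketch of agent rendering. The `renderAgent` method name is an assumption by analogy with renderChain, and `callYourProvider` is a hypothetical helper for your own LLM call:

```typescript
const result = await latitude.prompts.renderAgent({
  prompt,
  parameters: { location: 'Barcelona' },
  onStep: async ({ config, messages }) => {
    // Forward config.tools to your provider so the agent loop can terminate
    return await callYourProvider(config, messages) // hypothetical helper
  },
})
```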
Make sure to provide the config.tools parameter to the LLM provider in your onStep handler, otherwise the AI won't be able to stop the agent loop!
Logging
Creating Logs
Push a log to Latitude manually for a prompt:
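A sketch of manual log creation (assuming a `logs.create` method that takes the prompt path, the conversation messages in PromptL format, and the response):

```typescript
await latitude.logs.create(
  'weather/forecast-prompt',
  [{ role: 'user', content: 'Forecast for Barcelona, please.' }],
  { response: 'Sunny with a high of 24°C.' },
)
```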
Evaluations
Triggering Evaluations
Trigger an evaluation manually for a conversation:
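A sketch of triggering evaluations for a conversation (the `evaluationUuids` option name is an assumption):

```typescript
await latitude.evaluations.trigger(conversationUuid, {
  evaluationUuids: ['your-evaluation-uuid'], // hypothetical identifiers
})
```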
Creating Evaluation Results
Push a result to Latitude manually for an evaluation:
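A sketch of pushing an evaluation result manually (the `createResult` method name and the `result`/`reason` fields are assumptions):

```typescript
await latitude.evaluations.createResult(conversationUuid, evaluationUuid, {
  result: 4,
  reason: 'Accurate and well-structured response.',
})
```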
Complete Method Reference
Initialization
Prompts Namespace
Logs Namespace
Evaluations Namespace
Error Handling
The SDK throws LatitudeApiError instances when API requests fail. You can catch and handle these errors:
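A sketch of error handling (assuming LatitudeApiError is exported from the SDK package):

```typescript
import { Latitude, LatitudeApiError } from '@latitude-data/sdk'

const latitude = new Latitude(process.env.LATITUDE_API_KEY!)

try {
  await latitude.prompts.run('weather/forecast-prompt', {
    parameters: { location: 'Barcelona' },
  })
} catch (error) {
  if (error instanceof LatitudeApiError) {
    // API-level failure: inspect the message and handle or report it
    console.error('Latitude API error:', error.message)
  } else {
    throw error // unrelated failure, re-throw
  }
}
```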
Logging Features
- Automatic Logging: All runs through latitude.prompts.run() are automatically logged in Latitude, capturing inputs, outputs, performance metrics, and trace information.
- Custom Identifiers: Use the optional customIdentifier parameter to tag runs for easier filtering and analysis in the Latitude dashboard.
- Response Identification: Each response includes identifying information, such as a uuid, that can be used to reference the specific run later.