TypeScript
Integrate Latitude’s SDK into your TypeScript project
Latitude’s TypeScript integration has the following main features:
- Automatic tracing of LLM calls
- Interact with Latitude’s prompt manager from code: create, update and delete prompts
- Render Latitude prompts locally and run them against your LLM providers
- Run prompts with Latitude’s high-performing gateway
- Trigger LLM-as-judge and human-in-the-loop evaluations
- Programmatically push external logs to Latitude for evaluation and monitoring
Installation
To install the Latitude SDK, use your preferred package manager:
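For example, assuming the SDK is published as `@latitude-data/sdk`:

```bash
npm install @latitude-data/sdk
# or
pnpm add @latitude-data/sdk
# or
yarn add @latitude-data/sdk
```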
Getting Started
First, import the Latitude class from the SDK and initialize it with your API key:
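A minimal sketch of the setup, assuming the package name `@latitude-data/sdk` and that the constructor takes the API key plus an options object (the `projectId` option shown here is illustrative):

```typescript
import { Latitude } from '@latitude-data/sdk'

// Initialize the SDK with your Latitude API key.
// The projectId option is illustrative; point it at your own project.
const latitude = new Latitude('your-api-key', {
  projectId: 123,
})
```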
Examples
Check out our cookbook for more examples of how to use Latitude’s SDK.
Telemetry
Latitude can automatically trace all your LLM calls from most major providers and frameworks using our OpenTelemetry integration. We recommend this approach to easily get started using Latitude’s full capabilities.
Here’s how to integrate with all the supported providers/frameworks:
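As a rough sketch, instrumenting the OpenAI SDK might look like the following; the exact telemetry option names (`telemetry.modules`, the `openAI` key) are assumptions here, so check the provider-specific instructions for the precise shape:

```typescript
import { Latitude } from '@latitude-data/sdk'
import OpenAI from 'openai'

// Passing the provider module lets Latitude instrument it and emit a trace
// for every LLM call it makes.
const latitude = new Latitude('your-api-key', {
  telemetry: {
    modules: {
      openAI: OpenAI,
    },
  },
})

const openai = new OpenAI() // calls made with this client are now traced
```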
Learn more about traces and how to monitor them with Latitude.
A note during development
Latitude’s OpenTelemetry integration batches requests automatically in order to improve performance. This is helpful in production workloads, but during development you may want to disable batching. This can be done by setting the `disableBatch` option to `true`:
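For example, continuing the sketch above (the nesting of `disableBatch` inside the telemetry options is an assumption):

```typescript
const latitude = new Latitude('your-api-key', {
  telemetry: {
    disableBatch: true, // flush spans immediately instead of batching them
    modules: {
      openAI: OpenAI,
    },
  },
})
```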
Prompt Management
Get or create a prompt
To get or create a prompt, use the `getOrCreate` method:
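A minimal sketch, assuming prompts are exposed under `latitude.prompts` and that the second argument carries optional initial content for prompts that don’t exist yet:

```typescript
const prompt = await latitude.prompts.getOrCreate('path/to/my-prompt', {
  // Used as the initial content only if the prompt doesn't exist yet.
  prompt: 'Write a haiku about {{topic}}',
})

console.log(prompt)
```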
Run a prompt with your LLM provider
The `render` method will render your prompt and return the configuration and messages to use with your LLM provider. This render step is completely local and does not use Latitude’s runtime services.
Here’s an example of running a Latitude prompt locally using the OpenAI package:
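A rough sketch under a few assumptions: that the prompt can be fetched with `latitude.prompts.get`, that `render` takes the prompt plus parameters, and that the returned `config` mirrors the prompt’s frontmatter (model, temperature, and so on):

```typescript
import OpenAI from 'openai'

const openai = new OpenAI() // reads OPENAI_API_KEY from the environment

const prompt = await latitude.prompts.get('path/to/my-prompt')

// Render locally: substitutes parameters and compiles the prompt into
// provider-ready configuration and messages. No Latitude runtime involved.
const { config, messages } = await latitude.prompts.render({
  prompt,
  parameters: { topic: 'the ocean' },
})

// Pull the provider settings out of the rendered config.
const { model, temperature } = config as { model: string; temperature?: number }

const completion = await openai.chat.completions.create({
  model,
  temperature,
  messages: messages as unknown as OpenAI.Chat.Completions.ChatCompletionMessageParam[],
})

console.log(completion.choices[0].message.content)
```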
You can also execute chains by providing an `onStep` callback to the `renderChain` method, which will be run for each step of the chain to generate the response. Here’s an example:
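A sketch along the same lines, continuing the example above; the exact `onStep` signature and return type are assumptions. The idea is that each step hands you the accumulated config and messages and expects the assistant response back:

```typescript
const chainPrompt = await latitude.prompts.get('path/to/my-chain-prompt')

const conversation = await latitude.prompts.renderChain({
  prompt: chainPrompt,
  parameters: { topic: 'the ocean' },
  // Called once per chain step with the configuration and messages produced so far.
  // Return the provider's response so the next step can build on it.
  onStep: async ({ config, messages }) => {
    const { model } = config as { model: string }
    const completion = await openai.chat.completions.create({
      model,
      messages: messages as unknown as OpenAI.Chat.Completions.ChatCompletionMessageParam[],
    })

    return completion.choices[0].message.content ?? ''
  },
})
```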
`render` and `renderChain` only work with the latest iteration of Latitude’s open-source prompt syntax: PromptL.
Run a prompt through Latitude Gateway
Latitude’s Gateway is a high-performing proxy for your LLM calls, sitting between your application and the LLM provider. It includes additional features like automatic prompt caching based on content and prompt configuration.
In order to run a prompt through Latitude’s Gateway, use the `run` method:
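A minimal sketch, assuming `run` takes the prompt path plus an options object with parameters and optional callbacks (the callback names here are illustrative):

```typescript
const result = await latitude.prompts.run('path/to/my-prompt', {
  parameters: { topic: 'the ocean' },
  // Illustrative callbacks; omit them for a simple awaited run.
  onFinished: (run) => console.log('finished', run),
  onError: (error) => console.error(error),
})

console.log(result)
```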
Log Management
Pushing a log to Latitude
To create a log programmatically, use the `create` method:
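A minimal sketch, assuming logs are exposed under `latitude.logs` and that `create` takes the prompt path plus the conversation messages:

```typescript
await latitude.logs.create('path/to/my-prompt', [
  { role: 'user', content: 'Write a haiku about the ocean' },
  {
    role: 'assistant',
    content: 'Waves fold into foam,\nsalt wind carries gulls back home,\nthe tide keeps its time.',
  },
])
```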
Messages follow OpenAI’s format.