Overview
While most LLM providers follow a similar chat-like structure, there are subtle differences in how prompts are formatted and processed. PromptL addresses these differences by providing Adapters for each major provider, ensuring that your prompts are correctly formatted and seamlessly integrated. Currently, PromptL supports OpenAI and Anthropic. More providers will be supported in the future, and you can even create your own custom adapters for unsupported platforms.

Why Adapters?
Adapters handle provider-specific differences, such as:

- Message structure: OpenAI uses `role`-based messages, while Anthropic uses a `user`/`assistant` prefix.
- API integration: Adapters ensure compatibility with the provider’s API.
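To make the message-structure difference concrete, here is the same conversation shaped for each provider. These are hand-written illustrative literals, not PromptL output:

```typescript
// OpenAI: a flat list of role-based messages; the system prompt is
// simply the first message in the list.
const openaiMessages = [
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'user', content: 'Hello!' },
]

// Anthropic: the system prompt lives in a separate top-level field,
// and the message list alternates user/assistant turns.
const anthropicPayload = {
  system: 'You are a helpful assistant.',
  messages: [{ role: 'user', content: 'Hello!' }],
}
```

An adapter's job is exactly this kind of reshaping, so the same PromptL prompt can target either payload format.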
Getting Started with Adapters
Here’s how to use an adapter in your project. For this example, we’ll use OpenAI:
Supported Providers

OpenAI (default)
The OpenAI Adapter, which is selected by default, formats prompts to match OpenAI’s chat-completion API, including support for models like `gpt-4` and `gpt-3.5`.
Anthropic
The Anthropic Adapter ensures compatibility with Anthropic’s API.

Additional providers will be supported in the future. Check back for updates!
Extending Adapters
If you’re working with an unsupported provider, you can create your own adapter. Adapters are simple functions that transform PromptL’s `messages` and `config` into the format required by your provider.
An adapter is defined as an object with two functions: `{ fromPromptl, toPromptl }`. Each function takes an object with `messages` and `config` properties and returns the same object with transformed data.
To see the structure of `messages` used in PromptL, check out the GitHub PromptL Repository.
Example: Custom Adapter

The following defines a custom adapter that can be passed to the `render` function:
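A minimal sketch of such an adapter, targeting a hypothetical provider that expects `author`/`text` fields instead of `role`/`content`. The message types here are simplified for illustration — PromptL's real message structure is richer (see the repository):

```typescript
// Simplified shapes of the data PromptL passes through an adapter.
type PromptlMessage = { role: 'system' | 'user' | 'assistant'; content: string }
type PromptlPayload = { messages: PromptlMessage[]; config: Record<string, unknown> }

// Hypothetical provider format: `author`/`text` instead of `role`/`content`.
type ProviderMessage = { author: string; text: string }
type ProviderPayload = { messages: ProviderMessage[]; config: Record<string, unknown> }

const myAdapter = {
  // PromptL format -> provider format
  fromPromptl({ messages, config }: PromptlPayload): ProviderPayload {
    return {
      messages: messages.map((m) => ({ author: m.role, text: m.content })),
      config,
    }
  },
  // provider format -> PromptL format
  toPromptl({ messages, config }: ProviderPayload): PromptlPayload {
    return {
      messages: messages.map((m) => ({
        role: m.author as PromptlMessage['role'],
        content: m.text,
      })),
      config,
    }
  },
}
```

Assuming the same `render` signature as the built-in adapters, you would then pass it as `adapter: myAdapter`.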