Learn how to install and use PromptL in your project
PromptL simplifies the process of creating and managing prompts for large language models (LLMs). This quick start guide will show you how to set up PromptL in your project and generate dynamic prompts with minimal effort.
Prerequisites: Ensure you have Node.js installed and an API key for an LLM provider such as OpenAI or Anthropic.
Install PromptL via npm:
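A minimal install sketch, assuming the package is published on npm as `promptl-ai` (check the package name against the official registry entry):

```shell
npm install promptl-ai
```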
You’ll also need the client library for your LLM provider.
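For example, assuming you use OpenAI as your provider, install its official Node.js SDK:

```shell
npm install openai
```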
Here’s how to use PromptL to generate a dynamic prompt and interact with an LLM:
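A minimal sketch, assuming the `promptl-ai` package exposes a `render` function and an `Adapters` object, and that the official `openai` SDK reads your API key from the `OPENAI_API_KEY` environment variable; adjust the names to match your installed versions:

```javascript
import { render, Adapters } from 'promptl-ai'
import OpenAI from 'openai'

// The prompt combines front-matter configuration with template syntax.
const prompt = `
---
model: gpt-4o
temperature: 0.6
---
Generate a joke about {{ topic }}.
`

// render() resolves the template into provider-ready messages and config.
const { messages, config } = await render({
  prompt,
  parameters: { topic: 'chickens' },
  adapter: Adapters.openai,
})

// Send the rendered prompt to the model.
const client = new OpenAI()
const response = await client.chat.completions.create({ ...config, messages })

console.log(response.choices[0].message.content)
```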
Different providers will require a different setup and structure. Check out the Adapters section for more information on how to integrate with your provider.
- The `prompt` variable defines the PromptL prompt, including configuration and template syntax.
- The `parameters` object passes the value `topic: 'chickens'` to replace `{{ topic }}` in the prompt.
- The `render` function processes the prompt and generates the `messages` array and `config` object for your LLM provider.
- The client sends `messages` and `config` to the model, generating a response.

For production environments, add error handling to manage unexpected issues:
Once you’ve set up PromptL, explore its advanced features.
PromptL makes it easy to create and manage dynamic prompts for LLMs. By following this guide, you’ve set up PromptL, generated a dynamic prompt, and integrated it with an LLM provider. Now, you’re ready to explore its full potential.