## Overview
The structure of a PromptL prompt is designed for clarity and flexibility, making it easy to define both global configurations and the messages that drive your LLM conversations. A PromptL prompt is divided into two main sections:

- The config section, where you define global options for the prompt.
- The messages, which define the conversational flow between the user, assistant, and other roles.
## Config Section
The config section is an optional part of a PromptL prompt, defined at the very beginning. It allows you to specify global settings for your LLM, such as the `model`, `temperature`, or any other configuration supported by your provider.

This section is enclosed between triple dashes (`---`) and uses YAML format for key-value pairs:
## Messages
The messages section defines the conversational flow of your prompt. Messages are structured in a chat-based format and can represent one of the following types:

- System: Sets the context or rules for the conversation.
- User: Represents messages from the user to the assistant.
- Assistant: Represents messages from the assistant to the user.
- Tool: Used for interactions with external tools or APIs.
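For example, a minimal prompt could combine a system message and a user message like this (the wording of both messages is illustrative):

```
You are a helpful assistant.

<user>
What is the capital of France?
</user>
```

In this example: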
- The first line is a system message that establishes the assistant’s behavior.
- The `<user>` block defines a user message.