Prompt Engineering Basics
Prompt
A text-based instruction sent to a language model to guide its response. In Latitude, prompts can include logic, such as variables, conditionals, loops, and other advanced features using PromptL syntax.

PromptL

Latitude’s prompt templating language that enables dynamic prompt construction with variables, conditions, loops, and other programming concepts.

Language Model (LLM)

The AI system that generates responses based on prompts. Examples include GPT-4, Claude, and Gemini.

Provider

The company or service that offers access to language models, such as OpenAI, Anthropic, Google, or Azure.

Latitude Platform Components
Prompt Manager
The core interface for creating, editing, and managing prompts in Latitude. Includes the Prompt Editor, configuration settings, and version control.

Playground

An interactive testing environment where you can run prompts with different inputs and configurations to see how they perform.

AI Gateway

The deployment layer that exposes prompts as API endpoints, making them available for integration with applications. These endpoints are the addresses through which your applications access your prompts and agents.

Evaluations

Tools for assessing prompt performance using different methodologies:

- LLM-as-Judge: Using AI to evaluate outputs
- Programmatic Rules: Using code-based criteria
- Manual Evaluations: Human review of outputs
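A programmatic rule is just code that scores an output against criteria you define. As a rough illustration (these checks and function names are invented for this example, not part of the Latitude API):

```python
import json


def evaluate_output(output: str) -> dict:
    """Score a model output against simple code-based criteria.

    Returns one boolean per rule plus an overall "passed" flag.
    The rules below are illustrative; real evaluations would encode
    your own product requirements.
    """
    checks = {
        # Rule 1: the response must not be empty.
        "non_empty": len(output.strip()) > 0,
        # Rule 2: keep answers concise (under 500 characters).
        "concise": len(output) <= 500,
        # Rule 3: structured-output prompts must return valid JSON.
        "valid_json": _is_valid_json(output),
    }
    checks["passed"] = all(checks.values())
    return checks


def _is_valid_json(text: str) -> bool:
    try:
        json.loads(text)
        return True
    except ValueError:
        return False
```

Rules like these can run automatically over every log, which is what makes them cheap to apply at scale compared with LLM-as-judge or manual review.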
Logs
Records of all interactions between prompts and language models, including inputs, outputs, metadata, and performance metrics.

Datasets

Collections of input/output pairs used for testing, evaluations, and regression testing.

Advanced Features
Tools
Functions that prompts can call to access external capabilities, such as retrieving information, performing calculations, or taking actions.

Agents

Advanced prompts that can make decisions, use tools, and solve complex problems through multiple interaction steps.

JSON Mode

A configuration setting that enforces structured output formats through JSON schemas.

Cache

A mechanism that stores previously generated responses to improve performance and reduce costs.

Collaboration & Deployment
Version Control
Features for tracking changes to prompts, comparing versions, and managing the prompt lifecycle from draft to production.

Environments

Different contexts (e.g., development, staging, production) for deploying and testing prompts.

Telemetry

Automatic capture of metrics and performance data from prompt interactions.

Webhooks

Integration points that trigger actions in external systems when certain events occur in Latitude.

Development Concepts
SDK
Software Development Kits (available for TypeScript, Python, and other languages) that let you interact with Latitude programmatically from your own applications.

API

The Latitude HTTP API (Application Programming Interface) that provides access to platform features for custom integrations.

Self-Hosting

Running Latitude on your own servers instead of using the managed cloud version.

Building Blocks
Prompt Template
The base structure of a prompt that includes placeholders for variables.

Parameter

A value you provide in the prompt template that tells it what data to use or how to behave.

Snippet

A reusable prompt fragment that can be shared across multiple prompts.

System Message

Special instructions to the model that set context and behavior expectations.

User Message

Content presented as coming from a user in a conversation with the model.

Assistant Message

Content presented as previously generated by the model in a conversation.

Next Steps
Now that you’re familiar with the core concepts, you can:

- Follow the Quick Start for Product Managers (No-Code)
- Follow the Quick Start for AI Engineers (Coding)
- Learn about our Prompt Manager in more detail
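If you take the AI-engineer path, a first integration usually means running a deployed prompt through the AI Gateway. The sketch below shows only the general shape of such a call: the URL, payload fields, and header names are placeholders invented for this example, not Latitude’s actual API contract, so check the SDK and HTTP API docs for the real endpoints and field names.

```python
import json
import urllib.request

# Placeholder values: substitute your real gateway URL and API key.
GATEWAY_URL = "https://example.com/api/v1/prompts/my-prompt/run"  # hypothetical
API_KEY = "YOUR_API_KEY"


def build_request(parameters: dict) -> urllib.request.Request:
    """Assemble an HTTP request that runs a deployed prompt.

    The JSON body carries the prompt's parameters (the values that fill
    the template's placeholders). All field names here are illustrative.
    """
    body = json.dumps({"parameters": parameters}).encode("utf-8")
    return urllib.request.Request(
        GATEWAY_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )


# Actually sending the request would look like:
# with urllib.request.urlopen(build_request({"topic": "llamas"})) as resp:
#     print(json.load(resp))
```

Whatever the exact contract turns out to be, the pattern is the same: authenticate, POST the parameters for one prompt, and read the model's response from the JSON body.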