Core Concepts
Key terminology and concepts you need to know when using Latitude
This glossary explains the fundamental concepts and terminology used throughout Latitude to help you better understand the platform.
Prompt Engineering Basics
Prompt
A text-based instruction sent to a language model to guide its response. In Latitude, prompts can include variables, conditionals, loops, and other advanced features through PromptL syntax.
PromptL
Latitude’s prompt templating language that enables dynamic prompt construction with variables, conditions, loops, and other programming concepts.
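For illustration, a PromptL prompt typically pairs a configuration block with templated message content. The snippet below is a hedged sketch: the provider, model, and variable name are invented, and the exact syntax for conditionals and loops is documented in the PromptL reference rather than shown here.

```
---
provider: OpenAI
model: gpt-4o
---
You are a helpful assistant.

<user>
  Summarize the following text: {{ user_text }}
</user>
```

The `---` block holds model settings, message tags mark conversation roles, and `{{ }}` interpolates variables supplied at run time.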
Large Language Model (LLM)
The AI system that generates responses based on prompts. Examples include GPT-4, Claude, Gemini, and others.
Provider
The company or service that offers access to language models, such as OpenAI, Anthropic, Google, or Azure.
Latitude Platform Components
Prompt Manager
The core interface for creating, editing, and managing prompts in Latitude. Includes the Prompt Editor, configuration settings, and version control.
Playground
An interactive testing environment where you can run prompts with different inputs and configurations to see how they perform.
AI Gateway
The deployment layer that exposes prompts as API endpoints, making them available for integration with applications.
Evaluations
Tools for assessing prompt performance using different methodologies:
- LLM-as-Judge: Using AI to evaluate outputs
- Programmatic Rules: Using code-based criteria
- Manual Evaluations: Human review of outputs
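As a sketch of the programmatic-rules approach, a code-based check can assert properties of a model's output directly. The rule below (length and required-keyword checks) is hypothetical, not one of Latitude's built-in evaluators:

```python
def evaluate_output(output: str, max_chars: int = 500,
                    required_terms: tuple = ("refund",)) -> dict:
    """Score an LLM output against simple code-based criteria."""
    checks = {
        "within_length": len(output) <= max_chars,
        "mentions_required_terms": all(
            t.lower() in output.lower() for t in required_terms
        ),
        "non_empty": bool(output.strip()),
    }
    return {"passed": all(checks.values()), "checks": checks}

result = evaluate_output("Our refund policy allows returns within 30 days.")
```

Rules like this are deterministic and cheap to run, which makes them a good first gate before more expensive LLM-as-judge evaluations.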
Logs
Records of all interactions between prompts and language models, including inputs, outputs, metadata, and performance metrics.
Datasets
Collections of input/output pairs used for testing, evaluations, and regression testing.
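A dataset of input/output pairs can drive a simple regression check: run the prompt on each input and compare against the expected output. This sketch uses a stubbed `run_prompt` in place of a real model call, and the dataset rows are invented:

```python
# Hypothetical dataset of input/expected-output pairs.
dataset = [
    {"input": "2 + 2", "expected": "4"},
    {"input": "3 * 3", "expected": "9"},
]

def run_prompt(prompt_input: str) -> str:
    # Stand-in for a real LLM call; returns canned answers for the demo.
    return {"2 + 2": "4", "3 * 3": "9"}.get(prompt_input, "")

def regression_test(rows: list) -> float:
    """Return the fraction of rows whose output matches the expected value."""
    passed = sum(run_prompt(row["input"]) == row["expected"] for row in rows)
    return passed / len(rows)

score = regression_test(dataset)
```

Tracking this score across prompt versions is what turns a dataset into a regression suite: a drop signals that a prompt change broke a previously working case.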
Advanced Features
Tools
Functions that prompts can call to access external capabilities, such as retrieving information, performing calculations, or taking actions.
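Tool use generally follows a declare-and-dispatch pattern: the prompt is given a schema describing the tool, and when the model requests a call, the application runs the matching function and returns the result. The tool name, schema shape, and weather stub below are illustrative, not a Latitude API:

```python
import json

# Illustrative tool schema, similar in spirit to function-calling definitions.
get_weather_schema = {
    "name": "get_weather",
    "description": "Look up the current temperature for a city.",
    "parameters": {"type": "object",
                   "properties": {"city": {"type": "string"}}},
}

def get_weather(city: str) -> str:
    # Stand-in for a real weather API call.
    return f"18C in {city}"

TOOLS = {"get_weather": get_weather}

def dispatch(tool_call: dict) -> str:
    """Execute the function the model requested, with its JSON arguments."""
    fn = TOOLS[tool_call["name"]]
    return fn(**json.loads(tool_call["arguments"]))

result = dispatch({"name": "get_weather", "arguments": '{"city": "Paris"}'})
```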
Agents
Advanced prompts that can make decisions, use tools, and solve complex problems through multiple interaction steps.
JSON Mode
A configuration setting that enforces structured output formats through JSON schemas.
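When structured output is enforced, the application typically still parses and validates the model's reply before using it. A minimal sketch using only the standard library; the expected fields are invented for illustration:

```python
import json

# Hypothetical required fields for a sentiment-classification prompt.
REQUIRED_FIELDS = {"sentiment": str, "confidence": float}

def parse_structured_output(raw: str) -> dict:
    """Parse a JSON reply and check it has the expected fields and types."""
    data = json.loads(raw)
    for field, expected_type in REQUIRED_FIELDS.items():
        if not isinstance(data.get(field), expected_type):
            raise ValueError(f"missing or invalid field: {field}")
    return data

reply = parse_structured_output('{"sentiment": "positive", "confidence": 0.93}')
```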
Cache
A mechanism that stores previously generated responses to improve performance and reduce costs.
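Conceptually, a response cache keys on the prompt (and model settings) and serves the stored response on a repeat request, skipping the model call. A minimal in-memory sketch, not Latitude's actual cache implementation:

```python
import hashlib

_cache: dict = {}

def cache_key(prompt: str, model: str) -> str:
    """Derive a stable key from the prompt text and model setting."""
    return hashlib.sha256(f"{model}:{prompt}".encode()).hexdigest()

def cached_completion(prompt: str, model: str, generate) -> str:
    """Return a cached response if one exists; otherwise generate and store it."""
    key = cache_key(prompt, model)
    if key not in _cache:
        _cache[key] = generate(prompt)
    return _cache[key]

calls = []
def fake_generate(prompt):
    calls.append(prompt)  # track how often the "model" is actually hit
    return f"echo: {prompt}"

first = cached_completion("hello", "gpt-4o", fake_generate)
second = cached_completion("hello", "gpt-4o", fake_generate)  # served from cache
```

Because identical prompts hash to the same key, the second call returns the stored response without invoking the generator again.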
Collaboration & Deployment
Version Control
Features for tracking changes to prompts, comparing versions, and managing the prompt lifecycle from draft to production.
Environments
Different contexts (e.g., development, staging, production) for deploying and testing prompts.
Telemetry
Automatic capture of metrics and performance data from prompt interactions.
Webhooks
Integration points that trigger actions in external systems when certain events occur in Latitude.
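Webhook consumers commonly verify that a delivery really came from the sender by checking an HMAC signature over the payload. The secret and event name below are hypothetical; consult Latitude's webhook documentation for its actual signing scheme:

```python
import hashlib
import hmac

SECRET = b"hypothetical-shared-secret"

def sign(payload: bytes) -> str:
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify_webhook(payload: bytes, signature_header: str) -> bool:
    """Constant-time comparison of the expected and received signatures."""
    return hmac.compare_digest(sign(payload), signature_header)

payload = b'{"event": "prompt.updated"}'
ok = verify_webhook(payload, sign(payload))
```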
Development Concepts
SDK
Software Development Kits (available for TypeScript, Python, and other languages) that allow programmatic interaction with Latitude.
API
The Latitude HTTP API that provides access to platform features for custom integrations.
Self-Hosting
Running Latitude on your own infrastructure instead of using the managed cloud version.
Building Blocks
Prompt Template
The base structure of a prompt that includes placeholders for variables.
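In the simplest case, rendering a template means substituting values into its placeholders. A sketch using Python's standard-library `string.Template` (PromptL itself offers richer features such as conditionals and loops; the placeholder names here are invented):

```python
from string import Template

# Hypothetical template with two placeholders.
template = Template("Summarize the following article in $style style:\n$article")

rendered = template.substitute(style="bullet-point", article="LLMs are...")
```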
Snippets
Reusable prompt fragments that can be shared across multiple prompts.
System Message
Special instructions to the model that set context and expectations for behavior.
User Message
Content presented as coming from a user in a conversation with the model.
Assistant Message
Content presented as previously generated by the model in a conversation.
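Together, these three roles form the conversation structure sent to the model: a single ordered list of messages, each tagged with its role. A minimal sketch of that shape (the exact wire format varies by provider):

```python
# Illustrative message list combining system, user, and assistant roles.
conversation = [
    {"role": "system", "content": "You are a concise technical assistant."},
    {"role": "user", "content": "What does PromptL add to plain prompts?"},
    {"role": "assistant", "content": "Variables, conditionals, and loops."},
    {"role": "user", "content": "Give an example of a variable."},
]

roles = [m["role"] for m in conversation]
```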
Next Steps
Now that you’re familiar with the core concepts, you can:
- Follow the Quick Start for Product Managers (No-Code)
- Follow the Quick Start for AI Engineers (Coding)
- Learn about our Prompt Manager in more detail