PromptL v0.1.0

PromptL is a templating language specifically designed for LLM prompting. It provides a structured way to create, manage, and chain prompts with support for variables, control flow, and multi-modal content.

The language spec and compiler are built and maintained by Latitude, the open-source prompt engineering platform.

Overview

PromptL files compile into a message format that's compatible with any LLM provider API through different adapters. The compiler transforms your structured templates into an array of messages, each with a role and content, ready to be sent to models like GPT-4, Claude, or any other LLM.

Here's a simple example of a PromptL template:

---
model: gpt-4o
temperature: 0.6
---

You are a useful AI assistant expert in geography.

<user>
Hi! What's the capital of {{ country_name }}?
</user>

With {{ country_name }} set to "Spain", this template compiles into a JSON object that can be used to call any LLM API:

{
    "model": "gpt-4o",
    "temperature": 0.6,
    "messages": [
        {
            "type": "system",
            "content": "You are a useful AI assistant expert in geography."
        },
        {
            "type": "user",
            "content": "Hi! What's the capital of Spain?"
        }
    ]
}

This output makes it simple to integrate PromptL with your existing LLM infrastructure, regardless of which provider or model you're using. The compiler handles all the complexity of transforming your templates while preserving the semantic structure of your prompts.
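For example, because the compiled object already mirrors the request shape most chat APIs expect, a host application can hand it to a provider SDK almost as-is. The sketch below uses the official OpenAI Node SDK with the compiled JSON shown above; the client setup and field mapping are illustrative assumptions, not part of PromptL itself.

import OpenAI from "openai";

// The compiled PromptL output from the example above (config plus messages).
const compiled = {
    model: "gpt-4o",
    temperature: 0.6,
    messages: [
        { role: "system" as const, content: "You are a useful AI assistant expert in geography." },
        { role: "user" as const, content: "Hi! What's the capital of Spain?" },
    ],
};

// Reads OPENAI_API_KEY from the environment.
const client = new OpenAI();

async function main() {
    // The compiled object already matches the chat completions request shape.
    const completion = await client.chat.completions.create({
        model: compiled.model,
        temperature: compiled.temperature,
        messages: compiled.messages,
    });
    console.log(completion.choices[0].message.content);
}

main();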

Document execution

PromptL documents are executed sequentially from top to bottom, making it intuitive to build complex prompt pipelines. Each block, control flow statement, and variable assignment is processed in order, allowing you to create sophisticated workflows while maintaining readability.

/* Example of sequential execution */
<step as='analysis'>
    <system>You are a feedback analyzer</system>
    <user>
        Analyze this feedback: {{ feedback }}
        Return your analysis in this JSON format:
        {
            "sentiment": "positive" or "negative",
            "main_points": ["point 1", "point 2", ...]
        }
    </user>
</step>

{{ if analysis.sentiment == 'negative' }}
    <step as='response'>
        <user>
            Generate a careful response addressing these concerns:
            {{ analysis.main_points }}
        </user>
    </step>
{{ else }}
    <step as='response'>
        <user>
            Write a thank you note highlighting these positive points:
            {{ analysis.main_points }}
        </user>
    </step>
{{ endif }}

This sequential execution model, combined with features like variable scoping, step isolation, and control flow, lets you build multi-step workflows in which later steps build on the results of earlier ones.

The predictable execution flow makes it easy to reason about your prompts and debug complex interactions, while the structured nature of the language helps prevent common prompting pitfalls.
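
To make the runtime side of this concrete, the sketch below shows how a host application might drive step execution from TypeScript: compile up to the next step, call the model, feed the response back, and repeat until the document is done. The PromptChain interface, runChain loop, and callModel helper are hypothetical stand-ins for illustration, not the actual API of the TypeScript implementation.

// Hypothetical shapes for illustration; the real TypeScript package may expose a different API.
interface ChatMessage {
    role: "system" | "user" | "assistant";
    content: string;
}

interface StepResult {
    completed: boolean;                 // true once the whole document has finished executing
    messages: ChatMessage[];            // messages compiled for the current step
    config: Record<string, unknown>;    // step-level config (model, temperature, ...)
}

// Stand-in for a compiled PromptL document: each call advances past the next <step>.
interface PromptChain {
    step(previousResponse?: string): Promise<StepResult>;
}

// Stand-in for a provider call, e.g. the OpenAI example shown earlier.
async function callModel(messages: ChatMessage[], config: Record<string, unknown>): Promise<string> {
    throw new Error("wire this to your provider SDK");
}

// Sequential execution: each step sees the response of the step before it,
// which is how a value like `analysis` becomes available to later blocks.
async function runChain(chain: PromptChain): Promise<string> {
    let response: string | undefined;
    while (true) {
        const { completed, messages, config } = await chain.step(response);
        if (completed) return response ?? "";
        response = await callModel(messages, config);
    }
}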

Core syntax

Learn the fundamental building blocks and syntax of PromptL to create well-structured and maintainable prompt templates.

Variable interpolation and assignment

Insert dynamic values into your prompts and manage state throughout your templates. Variables can be assigned, used with default values, and modified using helper functions.

/* Basic interpolation */
{{ user_input }}

/* Assignment examples */
{{ is_premium_member = true }}

/* Default values */
{{ system_prompt || 'You are a helpful AI assistant' }}

Role-based blocks

Structure your conversations with clear role definitions. Each block represents a different participant in the conversation, making it easy to maintain context and flow in complex interactions.

<user>
    Write a blog post about {{ topic }}. Use a {{ tone }} tone.
</user>

<system>
    You are an expert content writer with deep knowledge of {{ industry }}.
    Always write in a clear, engaging style.
</system>

<assistant>
    Here's a draft blog post about {{ topic }}...
</assistant>

<message role='example'>
    Here's an example of high-quality content in this style...
</message>

Steps and chaining

Break down complex prompts into manageable steps. Each step can be executed independently or as part of a chain, with fine-grained control over context flow and model selection.

<step as='researchPhase' provider='OpenAI' model='gpt-4'>
    <user>
        Research key points about {{ topic }} and create an outline.
    </user>
</step>

<step as='writing' isolated='true' temperature='0.7'>
    <user>
        Using this outline: {{ researchPhase.outline }}
        Write a detailed article.
    </user>
</step>

Control flow

Add logic to your templates with conditionals and loops. Create dynamic prompts that adapt based on variables and iterate over collections of data.

/* Conditionals */
{{ if audience == 'technical' }}
    Provide a detailed analysis including architecture patterns, performance considerations, and implementation specifics
{{ else if audience == 'business' }}
    Focus on business value, ROI, and high-level strategic implications
{{ else }}
    Give a balanced overview covering key points without technical jargon
{{ endif }}

/* Loops with index */
{{ for task, index in tasks }}
    {{ index + 1 }}. Complete this task: {{ task }}
{{ else }}
    No tasks provided
{{ endfor }}

Content types

Support for multi-modal conversations with structured content blocks. Easily mix text and images while maintaining clear separation between different types of content.

<content type='text'>
    Analyze this code snippet and suggest improvements
</content>

/* Shorthand for text */
<content-text>
    Describe what you see in the following image
</content-text>

<content-image>
    screenshots/code-review.png
</content-image>
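
In most provider APIs, a message that mixes text and image blocks like these ends up as a list of content parts rather than a single string. The object below is only a sketch of how such a message could look after being mapped to an OpenAI-style content-parts array; the exact field names depend on the adapter and provider you target, and the URL is a placeholder.

// Sketch of a mixed text + image user message mapped to an OpenAI-style
// chat request. Field names follow the provider's format, not necessarily
// PromptL's internal compiled output; the URL is a placeholder.
const multiModalMessage = {
    role: "user" as const,
    content: [
        {
            type: "text" as const,
            text: "Describe what you see in the following image",
        },
        {
            type: "image_url" as const,
            image_url: { url: "https://example.com/screenshots/code-review.png" },
        },
    ],
};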

Partials and reusability

Break down complex prompts into reusable components. Partials can accept parameters and be included in other templates, promoting code reuse and maintainability.

<prompt path='prompts/code-review' language='python' style='detailed'/>

Error handling and raw content

Safely handle undefined values and include literal content when needed. The raw block allows you to include template syntax as literal text without processing.

/* Safe access to nested properties */
{{ if response?.suggestions }}
    Consider these improvements: {{ response.suggestions }}
{{ endif }}

/* Raw content blocks */
{{ raw }}
    Here's how to use variables: {{ variable_name }}
{{ endraw }}

Implementations

PromptL provides an official implementation in TypeScript/JavaScript, with support for more languages coming soon.
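
As a rough orientation, using the TypeScript package usually boils down to rendering a template with parameters and handing the result to a provider. The render import, its option names, and the returned shape in this sketch are assumptions about the package surface; check the package documentation for the exact API.

// Assumed API surface; option names and return shape may differ in the real package.
import { render } from "promptl-ai";

const prompt = `
---
model: gpt-4o
temperature: 0.6
---
You are a useful AI assistant expert in geography.

<user>
Hi! What's the capital of {{ country_name }}?
</user>
`;

async function main() {
    // Assumed return shape: compiled messages plus the front-matter config.
    const { messages, config } = await render({
        prompt,
        parameters: { country_name: "Spain" },
    });
    console.log(config, messages);
}

main();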

Contributing

We welcome contributions from the community! Whether you're fixing bugs, improving documentation, or proposing new features, your help makes PromptL better.

For any questions or support, you can also reach out to the Latitude team.