- 🎯 Jinja-Powered Templates: Leverage the full power of Jinja2 templating for dynamic prompt generation
- 🎭 Role-Based Messaging: Structure conversations with system, user, assistant, and tool roles
- 🤖 Pydantic-AI Compatible: Returns structured pydantic-ai ModelMessage objects for seamless integration
- 🔄 Format Conversion: Convert to OpenAI chat API (or other APIs) through pydantic-ai's message mapping
```bash
pip install prompt-bottle
```
```python
from prompt_bottle import render

# Simple template
template = """
You are a helpful assistant.
<div role="user">{{ user_message }}</div>
"""

messages = render(template, user_message="Hello, world!")
print(messages)
```
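`render()` returns pydantic-ai message objects rather than plain dicts. A minimal sketch for inspecting them (an assumption about the mapping: the raw text above is expected to become a system part and the `<div role="user">` a user part, per the role rules described later):

```python
# Walk the rendered messages; each ModelRequest/ModelResponse exposes .parts
for message in messages:
    for part in message.parts:
        print(type(part).__name__, getattr(part, "content", ""))
```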
A more advanced template can combine raw system text, loops over prior turns, and nested tags:

```jinja
You are a helpful assistant.
{{ system_instructions }}

{% for item in conversation %}
<div role="user">{{ item.message }}</div>
<div role="assistant">
    <think>{{ item.reasoning }}</think>
    {{ item.response }}
</div>
{% endfor %}

<div role="user">
    <Instruction>
        Now you are required to answer the query based on the context.
        Your output must be quoted in <Answer></Answer> tags.
    </Instruction>
    <Context>
        {{ context }}
    </Context>
    <Query>
        {{ query }}
    </Query>
</div>
```
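As a sketch, the template above could be rendered like this. The variable names and the `message`/`reasoning`/`response` keys come from the template itself; the filename and the values are illustrative placeholders:

```python
from pathlib import Path

from prompt_bottle import render

# Assumes the Jinja source shown above was saved as "rag_prompt.jinja"
template = Path("rag_prompt.jinja").read_text()

messages = render(
    template,
    system_instructions="Answer strictly from the provided context.",
    conversation=[
        {
            "message": "What is Prompt Bottle?",
            "reasoning": "The context describes a Jinja-based prompt renderer.",
            "response": "A Jinja-powered prompt templating library.",
        }
    ],
    context="Prompt Bottle renders Jinja templates into pydantic-ai messages.",
    query="How do I render a template?",
)
```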
💡 Check out example.py and example.jinja for a comprehensive example with all features!
> [!WARNING]
> You can use any HTML-like tags in your prompt other than the reserved tags. However, every tag must be properly closed (e.g., `<instruct>content</instruct>`, not `<instruct>content`) to avoid parsing errors.
Prompt Bottle supports four main roles (see the sketch after this list):

- `system`: System instructions and configuration. Raw text outside any role tag defaults to this role.
- `user`: User messages and queries
- `assistant`: AI assistant responses
- `tool`: Tool execution results
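A minimal role sketch (the variable names are placeholders; the `tool` role is omitted because its content format isn't shown in this section):

```jinja
{# Raw text outside any role tag defaults to the system role #}
You are a terse assistant.
<div role="user">{{ question }}</div>
<div role="assistant">{{ previous_answer }}</div>
<div role="user">{{ follow_up }}</div>
```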
Assistant responses can include multiple content tags (see the sketch after this list):

- `<text>`: Regular text content. Raw text inside an assistant block defaults to this tag.
- `<tool_call>`: Function/tool invocations
- `<think>`: Reasoning and thought processes

TODO: Multimodal input will be supported in the future.
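For illustration, an assistant turn combining these tags might look like the sketch below; `<tool_call>` is left out because its payload format isn't documented in this section (see example.jinja for the full set):

```jinja
<div role="assistant">
    <think>The context states the library renders Jinja templates.</think>
    <text>Prompt Bottle renders Jinja templates into structured messages.</text>
</div>
```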
`render(template, **kwargs)` renders a Jinja template with the provided variables and returns structured messages.

Parameters:
- `template`: Jinja template string
- `**kwargs`: Template variables

Returns: List of pydantic-ai `ModelMessage` objects (`ModelRequest`/`ModelResponse`)
Converts structured messages to the OpenAI chat completion format.

Parameters:
- `messages`: List of `ModelMessage` objects from `render()`
- `**model_kwargs`: OpenAI model configuration (default model: `gpt-4o`)

Returns: List of OpenAI-formatted message dictionaries
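For reference, the converted output for the quick-start template would resemble the standard OpenAI chat message shape sketched below (an illustration; fields for tool calls, images, etc. are not covered here):

```python
# Approximate shape of the OpenAI-formatted messages
[
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello, world!"},
]
```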
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.
- velin: Vue.js-based prompt template engine