Using Jinja2 to Build Smarter Prompts for LLMs


Mar 6, 2025 - 21:09
Jinja2 allows you to create flexible prompt templates that can adapt based on user input.
Consider the following Python script that defines a Jinja2 template for an LLM prompt:

from jinja2 import Template

prompt_template = """
{% if context %}
Context: {{ context }}
Based on the context provided above, answer the following question:
{% else %}
Answer the following question:
{% endif %}
Question: {{ question | default("Default question") }}
Give a detailed and well-explained answer.
"""

template = Template(prompt_template, trim_blocks=True, lstrip_blocks=True)

# Provide question and context
prompt = template.render(question="What is Python?", context="Python is a programming language.").strip()
print(prompt)

# Provide only question
prompt = template.render(question="What is AI?").strip()
print(prompt)

# Provide nothing
prompt = template.render().strip()
print(prompt)

Here’s what this template does:

If a context is provided, it includes it in the prompt.

Context: Python is a programming language.
Based on the context provided above, answer the following question:
Question: What is Python?
Give a detailed and well-explained answer.

If no context is provided, it directly asks the question.

Answer the following question:
Question: What is AI?
Give a detailed and well-explained answer.

The default() filter ensures a fallback question is used when no question is provided (included here for demonstration, though it can be handy in other scenarios).

Answer the following question:
Question: Default question
Give a detailed and well-explained answer.
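To see the default() filter behave in isolation, here is a minimal check (the question text is just an example):

```python
from jinja2 import Template

# Only the default() filter, no conditional blocks
t = Template('Question: {{ question | default("Default question") }}')

print(t.render())                            # no variable passed -> fallback is used
print(t.render(question="What is Jinja2?"))  # variable passed -> used as-is
```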

Yes, there are frameworks, and some of them use Jinja2 under the hood, but I prefer to keep things under my control and build from scratch where possible.
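One way to keep that control is to make missing variables fail loudly instead of silently rendering as empty strings. Jinja2's StrictUndefined does exactly that; the sketch below is an optional hardening step, not part of the original script:

```python
from jinja2 import Environment, StrictUndefined
from jinja2.exceptions import UndefinedError

# StrictUndefined raises an error when a referenced variable is missing,
# instead of rendering it as an empty string.
env = Environment(trim_blocks=True, lstrip_blocks=True, undefined=StrictUndefined)
strict_template = env.from_string("Question: {{ question }}")

try:
    strict_template.render()  # 'question' was never supplied
except UndefinedError as err:
    print(f"Refusing to render: {err}")
```

Variables guarded by the default() filter still render fine under StrictUndefined, so the two techniques combine well.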

If you have an interesting project, let's connect!
https://www.linkedin.com/in/mayankladdha31/