OrkaJS

Prompt Templates

Build reusable, composable prompts with variables, few-shot examples, chat roles, and partial pre-filling. Structure your LLM interactions for consistent, high-quality outputs.

Why Prompt Templates?

Hard-coded prompts are brittle and hard to maintain. Prompt templates separate the structure from the data, enabling reuse, testing, and iteration. They also enforce variable validation, preventing runtime errors from missing inputs.

Without Templates

// Hard-coded, no validation, hard to reuse
const prompt = `You are a ${role}.
Translate "${text}" to ${language}.`;
 
// Missing variable? Silent bug!
// Changing format? Edit every prompt!

With Templates

const tmpl = PromptTemplate.fromTemplate(
  'You are a {{role}}.\nTranslate "{{text}}" to {{language}}.'
);

// Validates all variables
const prompt = tmpl.format({
  role: 'translator',
  text: 'Hello',
  language: 'French'
});
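The validation shown above amounts to string interpolation plus a presence check. A minimal standalone sketch of the idea (illustrative only, not the OrkaJS implementation; `formatTemplate` is a hypothetical helper):

```typescript
// Replace each {{name}} placeholder, throwing if a value is missing
// instead of silently emitting "undefined".
function formatTemplate(
  template: string,
  values: Record<string, string>
): string {
  return template.replace(/\{\{(\w+)\}\}/g, (_m: string, name: string) => {
    if (!(name in values)) {
      throw new Error(`Missing template variable: ${name}`);
    }
    return values[name];
  });
}

const out = formatTemplate(
  'You are a {{role}}.\nTranslate "{{text}}" to {{language}}.',
  { role: 'translator', text: 'Hello', language: 'French' }
);
// out === 'You are a translator.\nTranslate "Hello" to French.'
```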

# PromptTemplate — Basic Templates

The foundation of prompt engineering in OrkaJS. Define a template with {{variables}}, then format it with concrete values. Variables are validated at format time — missing variables throw clear errors.

import { PromptTemplate } from 'orkajs/templates/prompt';
 
// Method 1: fromTemplate (auto-detects variables)
const template = PromptTemplate.fromTemplate(
  'Summarize the following {{format}} document about {{topic}}:\n\n{{content}}'
);

console.log(template.getInputVariables());
// ['format', 'topic', 'content']

const prompt = template.format({
  format: 'technical',
  topic: 'machine learning',
  content: 'Deep learning is a subset of machine learning...'
});

// Result:
// "Summarize the following technical document about machine learning:
//
// Deep learning is a subset of machine learning..."

// Method 2: Constructor (explicit variables)
const template2 = new PromptTemplate({
  template: 'Translate "{{text}}" from {{source}} to {{target}}.',
  inputVariables: ['text', 'source', 'target'],
  validateTemplate: true // Throws if variables don't match
});
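One plausible way `fromTemplate` can auto-detect variables is a regex scan for `{{name}}` placeholders, deduplicated in order of first appearance. A standalone sketch (the `extractVariables` helper is hypothetical, not the OrkaJS source):

```typescript
// Collect unique placeholder names in the order they first appear.
function extractVariables(template: string): string[] {
  const names: string[] = [];
  for (const match of template.matchAll(/\{\{(\w+)\}\}/g)) {
    if (!names.includes(match[1])) names.push(match[1]);
  }
  return names;
}

const vars = extractVariables(
  'Summarize the following {{format}} document about {{topic}}:\n\n{{content}}'
);
// vars: ['format', 'topic', 'content']
```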

# Partial Prompts — Pre-filled Variables

Partial prompts let you pre-fill some variables, creating a new template that only requires the remaining variables. This is powerful for creating specialized versions of generic templates.

import { PromptTemplate } from 'orkajs/templates/prompt';
 
// Generic translation template
const translationTemplate = PromptTemplate.fromTemplate(
  'You are a professional {{language}} translator.\nTranslate the following text to {{language}}:\n\n{{text}}'
);

// Create a French-specific template (partially applied)
const frenchTranslator = translationTemplate.partial({
  language: 'French'
});

console.log(frenchTranslator.getInputVariables());
// ['text'] — only 'text' remains!

const prompt = frenchTranslator.format({
  text: 'Hello, how are you?'
});
// "You are a professional French translator.
// Translate the following text to French:
//
// Hello, how are you?"

// Dynamic partial variables (functions)
const timestampTemplate = PromptTemplate.fromTemplate(
  '[{{timestamp}}] {{role}}: {{message}}'
);

const withTimestamp = timestampTemplate.partial({
  timestamp: () => new Date().toISOString() // Evaluated at format time
});

const prompt2 = withTimestamp.format({
  role: 'System',
  message: 'Server started'
});
// "[2024-01-15T10:30:00.000Z] System: Server started"
// "[2024-01-15T10:30:00.000Z] System: Server started"

💡 Partial Variables: String vs Function

  • String: static value, set once, e.g. { language: 'French' }
  • Function: dynamic value, evaluated each time format() is called, e.g. { timestamp: () => Date.now().toString() }
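The distinction above can be sketched in plain TypeScript: at format time, string partials pass through unchanged while function partials are invoked, so values like timestamps stay fresh on every call. `resolvePartials` is a hypothetical helper, not the OrkaJS implementation:

```typescript
type PartialValue = string | (() => string);

// Merge pre-filled partials with caller-supplied values. Functions are
// called now, so each format() sees a freshly computed value.
function resolvePartials(
  partials: Record<string, PartialValue>,
  values: Record<string, string>
): Record<string, string> {
  const resolved: Record<string, string> = {};
  for (const [name, value] of Object.entries(partials)) {
    resolved[name] = typeof value === 'function' ? value() : value;
  }
  // Caller-supplied values win over pre-filled ones
  return { ...resolved, ...values };
}

const merged = resolvePartials(
  { language: 'French', timestamp: () => new Date().toISOString() },
  { text: 'Hello' }
);
// merged.language === 'French'; merged.timestamp is a fresh ISO string
```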

# ChatPromptTemplate — Chat Messages with Roles

Modern LLMs use chat-based APIs with system, user, and assistant roles. ChatPromptTemplate structures your prompts as a sequence of role-based messages, each with its own template and variables.

import { ChatPromptTemplate } from 'orkajs/templates/chat';
 
// Create from message tuples [role, template]
const chatTemplate = ChatPromptTemplate.fromMessages([
  ['system', 'You are a {{role}} expert. Always respond in {{language}}.'],
  ['user', '{{question}}']
]);

console.log(chatTemplate.getInputVariables());
// ['role', 'language', 'question']

// Format returns ChatMessage[] — ready for LLM
const messages = chatTemplate.format({
  role: 'TypeScript',
  language: 'French',
  question: 'What are generics?'
});

console.log(messages);
// [
//   { role: 'system', content: 'You are a TypeScript expert. Always respond in French.' },
//   { role: 'user', content: 'What are generics?' }
// ]

// Use directly with LLM
const result = await orka.getLLM().generate('', { messages });

Multi-turn Conversations

Build conversation templates with assistant responses for context-aware interactions.

const conversationTemplate = ChatPromptTemplate.fromMessages([
  ['system', 'You are a helpful coding assistant specializing in {{language}}.'],
  ['user', 'How do I create a {{dataStructure}} in {{language}}?'],
  ['assistant', 'Here is how to create a {{dataStructure}} in {{language}}:'],
  ['user', 'Now show me how to {{operation}} with it.']
]);

const messages = conversationTemplate.format({
  language: 'TypeScript',
  dataStructure: 'linked list',
  operation: 'reverse'
});
 
// Creates a 4-message conversation with proper roles
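Mechanically, formatting a chat template amounts to interpolating each `[role, template]` tuple with the same variable map. A standalone sketch under that assumption (`formatMessages` is a hypothetical helper, not the ChatPromptTemplate source):

```typescript
type Role = 'system' | 'user' | 'assistant';
interface ChatMessage { role: Role; content: string; }

// Expand role/template tuples into concrete chat messages by running the
// same interpolation over every template.
function formatMessages(
  tuples: Array<[Role, string]>,
  values: Record<string, string>
): ChatMessage[] {
  return tuples.map(([role, template]) => ({
    role,
    content: template.replace(
      /\{\{(\w+)\}\}/g,
      (_m: string, name: string) => values[name] ?? ''
    ),
  }));
}

const messages = formatMessages(
  [
    ['system', 'You are a {{role}} expert. Always respond in {{language}}.'],
    ['user', '{{question}}'],
  ],
  { role: 'TypeScript', language: 'French', question: 'What are generics?' }
);
// Two messages with roles 'system' and 'user', fully interpolated
```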

# Partial Chat Templates

// Create a base template
const baseChat = ChatPromptTemplate.fromMessages([
  ['system', 'You are a {{role}} assistant. Respond in {{format}} format.'],
  ['user', '{{question}}']
]);

// Specialize for JSON responses
const jsonChat = baseChat.partial({
  format: 'JSON'
});

// Now only needs 'role' and 'question'
const messages = jsonChat.format({
  role: 'data analysis',
  question: 'List the top 5 programming languages'
});

# FewShotPromptTemplate — Learning by Example

Few-shot prompting is one of the most effective techniques for guiding LLM behavior. By providing examples of input-output pairs, you teach the model the exact format and style you expect. FewShotPromptTemplate makes this structured and reusable.

import { FewShotPromptTemplate } from 'orkajs/templates/few-shot';
 
const fewShotTemplate = new FewShotPromptTemplate({
  // Examples that teach the model
  examples: [
    { input: 'happy', output: 'sad' },
    { input: 'tall', output: 'short' },
    { input: 'fast', output: 'slow' },
  ],

  // How each example is formatted
  examplePrompt: 'Input: {{input}}\nOutput: {{output}}',

  // Text before examples
  prefix: 'Give the antonym of every input word.',

  // Text after examples (with the actual question)
  suffix: 'Input: {{input}}\nOutput:',

  // Variables the user must provide
  inputVariables: ['input'],

  // Separator between examples
  exampleSeparator: '\n\n'
});

const prompt = fewShotTemplate.format({ input: 'bright' });

This generates:

Give the antonym of every input word.
 
Input: happy
Output: sad
 
Input: tall
Output: short
 
Input: fast
Output: slow
 
Input: bright
Output:
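The output above suggests the assembly a few-shot template performs: format each example with `examplePrompt`, then join prefix, formatted examples, and suffix with the separator. A standalone sketch of that assembly (hypothetical helpers, not the FewShotPromptTemplate source):

```typescript
interface FewShotConfig {
  examples: Array<Record<string, string>>;
  examplePrompt: string;
  prefix: string;
  suffix: string;
  exampleSeparator: string;
}

function interpolate(template: string, values: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (_m: string, name: string) => values[name] ?? '');
}

// prefix + each formatted example + formatted suffix, joined by separator
function formatFewShot(config: FewShotConfig, values: Record<string, string>): string {
  const parts = [
    config.prefix,
    ...config.examples.map((ex) => interpolate(config.examplePrompt, ex)),
    interpolate(config.suffix, values),
  ];
  return parts.join(config.exampleSeparator);
}

const prompt = formatFewShot(
  {
    examples: [
      { input: 'happy', output: 'sad' },
      { input: 'tall', output: 'short' },
    ],
    examplePrompt: 'Input: {{input}}\nOutput: {{output}}',
    prefix: 'Give the antonym of every input word.',
    suffix: 'Input: {{input}}\nOutput:',
    exampleSeparator: '\n\n',
  },
  { input: 'bright' }
);
// Matches the layout shown above: prefix, two examples, then the query
```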

# Dynamic Examples

Add examples dynamically based on context or user behavior.

const template = new FewShotPromptTemplate({
  examples: [], // Start empty
  examplePrompt: 'Question: {{question}}\nSQL: {{sql}}',
  prefix: 'Convert natural language to SQL queries.',
  suffix: 'Question: {{question}}\nSQL:',
  inputVariables: ['question']
});

// Add examples based on the database schema
template.addExamples([
  {
    question: 'How many users are there?',
    sql: 'SELECT COUNT(*) FROM users;'
  },
  {
    question: 'Find all active orders',
    sql: "SELECT * FROM orders WHERE status = 'active';"
  },
  {
    question: 'Get the top 10 products by revenue',
    sql: 'SELECT name, SUM(price * quantity) as revenue FROM products GROUP BY name ORDER BY revenue DESC LIMIT 10;'
  }
]);

const prompt = template.format({
  question: 'What are the most popular categories?'
});
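Conceptually, `addExamples` amounts to appending to a mutable example list that is re-serialized on every `format()` call. A tiny sketch of that idea (`ExampleStore` is a hypothetical class, not the FewShotPromptTemplate source):

```typescript
// Holds a growing list of examples; the template would re-render this
// list each time format() is called, so new examples take effect immediately.
class ExampleStore {
  private examples: Array<Record<string, string>> = [];

  addExamples(batch: Array<Record<string, string>>): void {
    this.examples.push(...batch);
  }

  count(): number {
    return this.examples.length;
  }
}

const store = new ExampleStore();
store.addExamples([
  { question: 'How many users are there?', sql: 'SELECT COUNT(*) FROM users;' }
]);
// store.count() === 1
```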

# Real-World Few-Shot Examples

🏷️ Classification

const classifier = new FewShotPromptTemplate({
  examples: [
    { text: 'I love this product!', label: 'positive' },
    { text: 'Terrible experience', label: 'negative' },
    { text: 'It works fine', label: 'neutral' },
  ],
  examplePrompt: 'Text: "{{text}}"\nSentiment: {{label}}',
  prefix: 'Classify the sentiment of each text.',
  suffix: 'Text: "{{text}}"\nSentiment:',
  inputVariables: ['text']
});

📝 Data Extraction

const extractor = new FewShotPromptTemplate({
  examples: [
    {
      text: 'John Smith, CEO of Acme Corp, announced...',
      output: '{"name": "John Smith", "title": "CEO", "company": "Acme Corp"}'
    },
    {
      text: 'Dr. Jane Doe from MIT published...',
      output: '{"name": "Jane Doe", "title": "Dr.", "company": "MIT"}'
    }
  ],
  examplePrompt: 'Text: {{text}}\nJSON: {{output}}',
  prefix: 'Extract person information as JSON.',
  suffix: 'Text: {{text}}\nJSON:',
  inputVariables: ['text']
});

# Complete Example — Combining Everything

templates-example.ts
import { createOrka, OpenAIAdapter } from 'orkajs/core';
import { ChatPromptTemplate } from 'orkajs/templates/chat';
import { FewShotPromptTemplate } from 'orkajs/templates/few-shot';
import { StructuredOutputParser } from 'orkajs/parsers/structured';
import { CachedLLM } from 'orkajs/cache/llm';
import { MemoryCache } from 'orkajs/cache/memory';
import { z } from 'zod';
 
// Setup
const llm = new OpenAIAdapter({ apiKey: process.env.OPENAI_API_KEY! });
const cachedLLM = new CachedLLM(llm, new MemoryCache());

const orka = createOrka({ llm: cachedLLM });

// 1. Define output schema
const schema = z.object({
  sentiment: z.enum(['positive', 'negative', 'neutral']),
  confidence: z.number().min(0).max(1),
  keywords: z.array(z.string())
});
const parser = StructuredOutputParser.fromZodSchema(schema);

// 2. Create few-shot examples
const fewShot = new FewShotPromptTemplate({
  examples: [
    {
      review: 'Amazing product, exceeded expectations!',
      output: '{"sentiment": "positive", "confidence": 0.95, "keywords": ["amazing", "exceeded"]}'
    },
    {
      review: 'Broke after 2 days, waste of money',
      output: '{"sentiment": "negative", "confidence": 0.9, "keywords": ["broke", "waste"]}'
    }
  ],
  examplePrompt: 'Review: "{{review}}"\nAnalysis: {{output}}',
  prefix: 'Analyze product reviews. Return JSON with sentiment, confidence, and keywords.',
  suffix: 'Review: "{{review}}"\nAnalysis:',
  inputVariables: ['review']
});

// 3. Create chat template
const chatTemplate = ChatPromptTemplate.fromMessages([
  ['system', 'You are a sentiment analysis expert. {{formatInstructions}}'],
  ['user', '{{fewShotPrompt}}']
]);

// 4. Format and call
const fewShotPrompt = fewShot.format({
  review: 'Good quality but shipping was slow'
});

const messages = chatTemplate.format({
  formatInstructions: parser.getFormatInstructions(),
  fewShotPrompt: fewShotPrompt
});

const result = await orka.getLLM().generate('', { messages });
const analysis = parser.parse(result.content);
 
console.log(analysis);
// { sentiment: 'neutral', confidence: 0.7, keywords: ['good quality', 'slow'] }

Comparison

| Template | Use Case | Output |
| --- | --- | --- |
| PromptTemplate | Simple text prompts with variables | string |
| ChatPromptTemplate | Role-based chat messages | ChatMessage[] |
| FewShotPromptTemplate | Learning by example | string |

vs Prompt Versioning

OrkaJS has two complementary prompt systems that work together:

Prompt Templates (this page)

Build-time composition: variables, few-shot, chat roles, partial application. For structuring how prompts are built.

Prompt Versioning (PromptRegistry)

Runtime management: versioning, rollback, diff, persistence. For managing prompt evolution over time.

Best Practices

1. Use Few-Shot for Format Control

When you need a specific output format (JSON, CSV, structured text), few-shot examples are more reliable than instructions alone.

2. Use Partials for Specialization

Create a generic template, then use .partial() to create specialized versions. This avoids duplication and ensures consistency.

3. Always Use System Messages

ChatPromptTemplate makes it easy to include system messages. Always set the role, tone, and constraints in the system message.

4. Use 3-5 Examples

For few-shot prompting, 3-5 diverse examples usually give the best results. Too few and the model may not learn the pattern; too many wastes tokens.

Tree-shaking Imports

// ✅ Import only what you need
import { PromptTemplate } from 'orkajs/templates/prompt';
import { ChatPromptTemplate } from 'orkajs/templates/chat';
import { FewShotPromptTemplate } from 'orkajs/templates/few-shot';
 
// ✅ Or import from index
import { PromptTemplate, ChatPromptTemplate, FewShotPromptTemplate } from 'orkajs/templates';