Open Source Prompt Infrastructure

Version control
for AI prompts

Ship prompt changes without code deploys. Manage versions, promote across environments, and resolve the right prompt at runtime — all through a type-safe SDK.


Prompts in your codebase = chaos

Every prompt change requires a code review, a deploy, and a prayer. PromptOps decouples your prompts from your code.

😩 Without PromptOps

  • Prompts hardcoded in source files
  • Version → PR → Review → Deploy cycle
  • No rollback without code revert
  • Same prompt in dev and production
  • No audit trail of changes

✨ With PromptOps

  • Prompts managed via API + dashboard
  • Version → Promote → Live in seconds
  • Instant rollback, any environment
  • dev → staging → production pipeline
  • Full version history with metadata

Three steps to production prompts

Install the SDK. Fetch your prompt. Render with variables. That's it.

app/generate.ts
import { PromptOps } from '@promptops/sdk'

// Initialize with your API key
const promptOps = new PromptOps({
  apiKey: process.env.PROMPTOPS_API_KEY,
  baseUrl: 'https://api.promptops.dev',
})

// Fetch the active prompt for your environment
const prompt = await promptOps.getPrompt('support-classifier', {
  environment: 'production',  // or 'dev', 'staging'
})

// Render with variables
const message = promptOps.render(prompt, {
  ticketContent: userMessage,
  customerTier: 'enterprise',
})

// Use with any LLM
const response = await openai.chat.completions.create({
  model: prompt.model,             // "gpt-4" (from PromptOps)
  temperature: prompt.temperature, // 0.3 (from PromptOps)
  messages: [
    { role: 'system', content: prompt.systemPrompt },
    { role: 'user', content: message },
  ],
})
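The render call above presumably performs simple variable substitution into the version's template. A minimal sketch of that idea, assuming a {{variable}} placeholder syntax (the SDK's actual template format is an assumption here, not confirmed by this page):

```typescript
// Illustrative {{variable}} substitution; the real SDK's syntax may differ.
function renderTemplate(template: string, vars: Record<string, string>): string {
  // Replace each {{name}} placeholder with its variable's value.
  return template.replace(/\{\{(\w+)\}\}/g, (_match, name: string) => {
    const value = vars[name]
    if (value === undefined) throw new Error(`missing template variable: ${name}`)
    return value
  })
}

const rendered = renderTemplate(
  'Classify this {{customerTier}} ticket: {{ticketContent}}',
  { customerTier: 'enterprise', ticketContent: 'SSO login is broken' },
)
// rendered === "Classify this enterprise ticket: SSO login is broken"
```

Failing loudly on a missing variable is the safer default: a silently empty slot in a prompt is much harder to spot than a thrown error.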
1
Zero dependencies

Native fetch, no bloat. Works in Node.js 18+, Deno, Bun, and edge runtimes.

2
Environment-aware

Fetch different prompt versions for dev, staging, and production automatically.

3
Built-in resilience

Local caching with TTL + stale fallback. Your app works even if PromptOps is down.
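The stale-fallback behavior is a standard read-through cache pattern: serve entries that are within their TTL, refresh expired ones, and fall back to the last good value if the refresh fails. A minimal sketch of that pattern (not the SDK's actual internals):

```typescript
// Read-through cache with TTL + stale fallback. Illustrative, not the SDK source.
type Entry<T> = { value: T; fetchedAt: number }

class StaleFallbackCache<T> {
  private entries = new Map<string, Entry<T>>()

  constructor(private ttlMs: number) {}

  async get(key: string, fetcher: () => Promise<T>): Promise<T> {
    const entry = this.entries.get(key)
    if (entry && Date.now() - entry.fetchedAt < this.ttlMs) {
      return entry.value // fresh hit: no network call
    }
    try {
      const value = await fetcher()
      this.entries.set(key, { value, fetchedAt: Date.now() })
      return value
    } catch (err) {
      if (entry) return entry.value // refresh failed: serve the stale copy
      throw err // nothing cached yet: surface the error
    }
  }
}
```

With a scheme like this, a transient outage of the prompt API degrades to serving the last fetched prompt instead of failing the user's request.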

How PromptOps works

01

Create

Register a prompt with a human-readable slug via the API or dashboard.

POST /api/v1/prompts
02

Version

Add new versions with system prompts, templates, model configs, and metadata.

POST /api/v1/prompts/:id/versions
03

Deploy

Promote versions to dev, staging, or production. Rollback instantly if needed.

POST /api/v1/prompts/:id/promote
04

Resolve

Your SDK fetches the active version at runtime. Cached, resilient, fast.

sdk.getPrompt("slug")
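The four lifecycle steps map onto plain REST calls. A sketch of the request shapes, where the paths mirror the endpoints shown above but the payload field names are assumptions rather than the authoritative API reference:

```typescript
// Hypothetical request shapes for the four lifecycle steps.
// Paths follow the routes above; body fields are illustrative.
type ApiRequest = { method: 'GET' | 'POST'; path: string; body?: unknown }

const base = '/api/v1'
const slug = 'support-classifier'

// 01 Create: register a prompt under a human-readable slug
const create: ApiRequest = {
  method: 'POST',
  path: `${base}/prompts`,
  body: { slug, name: 'Support ticket classifier' },
}

// 02 Version: add an immutable version (template + model config + metadata)
const addVersion: ApiRequest = {
  method: 'POST',
  path: `${base}/prompts/${slug}/versions`,
  body: {
    systemPrompt: 'You are a support triage assistant.',
    template: 'Classify this ticket: {{ticketContent}}',
    model: 'gpt-4',
    temperature: 0.3,
  },
}

// 03 Deploy: promote a version to an environment
const promote: ApiRequest = {
  method: 'POST',
  path: `${base}/prompts/${slug}/promote`,
  body: { version: 2, environment: 'staging' },
}

// 04 Resolve: the SDK fetches whatever version is active for the environment
const resolve: ApiRequest = {
  method: 'GET',
  path: `${base}/prompts/${slug}?environment=staging`,
}
```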

Built for production

Everything you need to manage prompts at scale.

Version Control

Every prompt change creates an immutable version. Full history, full auditability.

Environment Pipeline

dev → staging → production promotion flow. Test before you ship to users.

Type-Safe SDK

TypeScript-first with full type inference. Zero runtime dependencies.

Instant Rollback

One API call to revert any environment to its previous version. No deploys.
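In this model a rollback is just another promote call targeting the previous version. A hedged sketch: the endpoint shape follows the promote route above, while the Deployment record and its field names are hypothetical:

```typescript
// Illustrative only: Deployment is a hypothetical shape for per-environment state.
type Deployment = { environment: string; activeVersion: number; previousVersion?: number }

function rollbackRequest(slug: string, d: Deployment) {
  if (d.previousVersion === undefined) {
    throw new Error(`no earlier version to roll back to in ${d.environment}`)
  }
  // Rolling back = promoting the previous version. No code deploy involved.
  return {
    method: 'POST' as const,
    path: `/api/v1/prompts/${slug}/promote`,
    body: { environment: d.environment, version: d.previousVersion },
  }
}
```

For example, `rollbackRequest('support-classifier', { environment: 'production', activeVersion: 7, previousVersion: 6 })` yields a promote call pinning production back to version 6.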

API-First

A RESTful API covers every operation. The dashboard is optional — automate everything.

Resilient Cache

In-memory cache with TTL and stale fallback. Works even when the API is unreachable.

Built for builders

🧑‍💻

AI Engineers

Ship prompt iterations in seconds instead of waiting for code deploys. Test in dev, promote to production with confidence.

promptOps.getPrompt("email-gen", { environment: "staging" })
🏗️

Platform Teams

Give product and ML teams control over prompts without touching the codebase. API keys scoped per environment.

POST /api/v1/prompts/onboarding/promote
🚀

Solo Founders

Start iterating on your AI product's prompts without building prompt infrastructure from scratch.

npm install @promptops/sdk

From the Blog

Deep dives into prompt infrastructure, AI DevOps, and the future of LLM engineering.

📖

The Guide to Prompt Version Control

Why hardcoding prompts is technical debt and how to build a prompt CI/CD pipeline.

Read Guide →
💀

Prompt Engineering is Dead. Long Live PromptOps.

The era of the "Prompt Whisperer" is over. The AI Engineer has arrived.

Read Article →
🔄

The Missing Manual for LLM CI/CD

Your code has pipelines. Your prompts should too. Build automated testing and deployment.

Read Guide →
🚫

Stop Hardcoding Prompts

The $3K–$24K/month mistake hiding in your codebase. Five dangers and the fix.

Read Article →
🧪

From "Vibe Checks" to Unit Tests

Your code has 95% coverage. Your prompts have 0%. Three types of tests to fix it.

Read Article →
⚗️

How to A/B Test Prompts Safely

Run prompt experiments in production without risking your users. Data over gut feelings.

Read Guide →
🧠

Context Engineering > Prompt Engineering

It's not what you ask — it's what the AI knows when you ask. The 90% below the surface.

Read Article →
🎯

Orchestrating Agents: System Prompt as Manager

How to write manager prompts that coordinate worker agents in multi-agent systems.

Read Guide →
🏛️

AI Governance for Enterprise

Access control, audit trails, and compliance alignment for production LLM deployments.

Read Guide →
📦

Why You Need a Prompt Registry

Vector DB is memory. LLMs are compute. The Prompt Registry is the missing source code layer.

Read Article →
🛡️

Guardrails for Agents: Preventing Runaway AI

Budget limits, action policies, and safety checks. Keep your autonomous agents under control.

Read Article →
🚀

Getting Started with PromptOps

Zero to your first versioned prompt in under 5 minutes. SDK setup, project creation, and deployment.

Read Docs →

Ready to take control?

Get started in under 5 minutes. No credit card required.

$ npm install @promptops/sdk