Ship prompt changes without code deploys. Manage versions, promote across environments, and resolve the right prompt at runtime — all through a type-safe SDK.
Every prompt change requires a code review, a deploy, and a prayer. PromptOps decouples your prompts from your code.
Install the SDK. Fetch your prompt. Render with variables. That's it.
import { PromptOps } from '@promptops/sdk'

// Initialize with your API key
const promptOps = new PromptOps({
  apiKey: process.env.PROMPTOPS_API_KEY,
  baseUrl: 'https://api.promptops.dev',
})

// Fetch the active prompt for your environment
const prompt = await promptOps.getPrompt('support-classifier', {
  environment: 'production', // or 'dev', 'staging'
})

// Render with variables
const message = promptOps.render(prompt, {
  ticketContent: userMessage,
  customerTier: 'enterprise',
})

// Use with any LLM
const response = await openai.chat.completions.create({
  model: prompt.model, // "gpt-4" (from PromptOps)
  temperature: prompt.temperature, // 0.3 (from PromptOps)
  messages: [
    { role: 'system', content: prompt.systemPrompt },
    { role: 'user', content: message },
  ],
})

Native fetch, no bloat. Works in Node.js 18+, Deno, Bun, and edge runtimes.
Fetch different prompt versions for dev, staging, and production automatically.
Local caching with TTL + stale fallback. Your app works even if PromptOps is down.
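The TTL-plus-stale-fallback behavior can be sketched in a few lines. This is an illustrative, self-contained model of the pattern, not the SDK's actual implementation; `StaleFallbackCache` and its API are hypothetical names.

```typescript
// Illustrative sketch of a TTL cache with stale fallback.
// The class and its API are hypothetical; the real SDK does this internally.
type CacheEntry<T> = { value: T; fetchedAt: number }

class StaleFallbackCache<T> {
  private entries = new Map<string, CacheEntry<T>>()
  constructor(private ttlMs: number) {}

  async get(key: string, fetcher: () => Promise<T>): Promise<T> {
    const entry = this.entries.get(key)
    // Within the TTL: serve the cached value, no network call.
    if (entry && Date.now() - entry.fetchedAt < this.ttlMs) return entry.value
    try {
      // TTL expired (or nothing cached yet): refetch.
      const value = await fetcher()
      this.entries.set(key, { value, fetchedAt: Date.now() })
      return value
    } catch (err) {
      // API unreachable: fall back to the stale copy if one exists.
      if (entry) return entry.value
      throw err
    }
  }
}
```

On this model, a prompt fetch never fails just because the API is down, as long as a previously fetched copy exists.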
1. Register a prompt with a human-readable slug via the API or dashboard.
   POST /api/v1/prompts
2. Add new versions with system prompts, templates, model configs, and metadata.
   POST /prompts/:id/versions
3. Promote versions to dev, staging, or production. Roll back instantly if needed.
   POST /prompts/:id/promote
4. Your SDK fetches the active version at runtime. Cached, resilient, fast.
   sdk.getPrompt("slug")

Everything you need to manage prompts at scale.
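As a sketch, the lifecycle steps above can be driven entirely over HTTP. The code below builds the three write calls as plain request descriptors; the endpoint paths come from the steps, while the body fields (`slug`, `systemPrompt`, `model`, `environment`) are illustrative assumptions, not the documented schema.

```typescript
// Sketch only: endpoint paths are taken from the lifecycle steps;
// every request-body field shown here is an assumed, illustrative schema.
const BASE = 'https://api.promptops.dev/api/v1'

interface ApiCall {
  method: 'POST'
  url: string
  body: Record<string, unknown>
}

function promptLifecycle(slug: string): ApiCall[] {
  return [
    // 1. Register the prompt under a human-readable slug
    { method: 'POST', url: `${BASE}/prompts`, body: { slug } },
    // 2. Add a version: system prompt, model config, metadata
    {
      method: 'POST',
      url: `${BASE}/prompts/${slug}/versions`,
      body: { systemPrompt: 'Classify the ticket.', model: 'gpt-4', temperature: 0.3 },
    },
    // 3. Promote the new version to an environment
    { method: 'POST', url: `${BASE}/prompts/${slug}/promote`, body: { environment: 'staging' } },
  ]
}
```

Each descriptor could then be sent with native fetch plus an API-key header; step 4 is just a `getPrompt` call from application code.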
- Every prompt change creates an immutable version. Full history, full auditability.
- A dev → staging → production promotion flow. Test before you ship to users.
- TypeScript-first with full type inference. Zero runtime dependencies.
- One API call to revert any environment to its previous version. No deploys.
- RESTful API for every operation. Dashboard is optional — automate everything.
- In-memory cache with TTL and stale fallback. Works even when the API is unreachable.
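One-call rollback rides on the promote endpoint listed in the lifecycle steps: repointing an environment at its previous version is a single POST. In this sketch only the endpoint path comes from the page; the `version` body field and the `Authorization` header shape are assumptions for illustration.

```typescript
// Sketch: rollback as a single promote call. The endpoint path matches the
// promote step; body fields and auth header are assumed for illustration.
function buildRollback(slug: string, previousVersion: string, apiKey: string): Request {
  return new Request(`https://api.promptops.dev/api/v1/prompts/${slug}/promote`, {
    method: 'POST',
    headers: { Authorization: `Bearer ${apiKey}`, 'Content-Type': 'application/json' },
    body: JSON.stringify({ environment: 'production', version: previousVersion }),
  })
}

// Sending it is then one line in Node 18+ (no deploy involved):
// await fetch(buildRollback('email-gen', 'v41', process.env.PROMPTOPS_API_KEY!))
```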
Ship prompt iterations in seconds instead of waiting for code deploys. Test in dev, promote to production with confidence.
promptOps.getPrompt("email-gen", { env: "staging" })

Give product and ML teams control over prompts without touching the codebase. API keys scoped per environment.
POST /api/v1/prompts/onboarding/promote

Start iterating on your AI product's prompts without building prompt infrastructure from scratch.
npm install @promptops/sdk

Deep dives into prompt infrastructure, AI DevOps, and the future of LLM engineering.
- Guide: Why hardcoding prompts is technical debt and how to build a prompt CI/CD pipeline.
- Article: The era of the "Prompt Whisperer" is over. The AI Engineer has arrived.
- Guide: Your code has pipelines. Your prompts should too. Build automated testing and deployment.
- Article: The $3K–$24K/month mistake hiding in your codebase. Five dangers and the fix.
- Article: Your code has 95% coverage. Your prompts have 0%. Three types of tests to fix it.
- Guide: Run prompt experiments in production without risking your users. Data over gut feelings.
- Article: It's not what you ask — it's what the AI knows when you ask. The 90% below the surface.
- Guide: How to write manager prompts that coordinate worker agents in multi-agent systems.
- Guide: Access control, audit trails, and compliance alignment for production LLM deployments.
- Article: Vector DB is memory. LLMs are compute. The Prompt Registry is the missing source code layer.
- Article: Budget limits, action policies, and safety checks. Keep your autonomous agents under control.
- Docs: Zero to your first versioned prompt in under 5 minutes. SDK setup, project creation, and deployment.

Get started in under 5 minutes. No credit card required.
$ npm install @promptops/sdk