Monitoring · Freemium · Open Source

PORTKEY

Production stack for GenAI builders

MIT

ABOUT

Production LLM systems often sprawl across providers, prompts, retries, fallbacks, logging, and governance rules, which makes them difficult to monitor and operate consistently. Portkey brings gateway routing, observability, prompt management, and guardrails into one platform so teams can run and control AI traffic with fewer custom integrations.

INSTALL
npx @portkey-ai/gateway
pip install portkey-ai

INTEGRATION GUIDE

1. Monitor LLM requests, traces, and usage metrics across multiple model providers
2. Route traffic with retries, fallbacks, and provider selection through a unified AI gateway
3. Apply guardrails and governance controls to production AI application traffic
4. Manage prompts and API-backed prompt templates for teams shipping GenAI features
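As a sketch of the routing step, a gateway config with retries and a provider fallback chain can be expressed as a plain dict and passed to the Python SDK. The overall shape (`strategy`, `retry`, `targets`) follows Portkey's config format, but the provider names, model IDs, and key placeholders below are illustrative assumptions, not values from this page:

```python
# Illustrative Portkey gateway routing config: retry up to 3 times,
# then fall back from the first target to the second. Provider names,
# models, and key placeholders are examples only.
fallback_config = {
    "strategy": {"mode": "fallback"},
    "retry": {"attempts": 3},
    "targets": [
        {
            "provider": "openai",
            "api_key": "OPENAI_KEY_PLACEHOLDER",
            "override_params": {"model": "gpt-4o"},
        },
        {
            "provider": "anthropic",
            "api_key": "ANTHROPIC_KEY_PLACEHOLDER",
            "override_params": {"model": "claude-3-5-sonnet-20240620"},
        },
    ],
}

# With the SDK installed (pip install portkey-ai), the config is
# attached to the client and requests route through the gateway
# (not executed here, since it needs a live API key):
#
#   from portkey_ai import Portkey
#   client = Portkey(api_key="PORTKEY_API_KEY", config=fallback_config)
#   resp = client.chat.completions.create(
#       messages=[{"role": "user", "content": "Hello"}],
#       model="gpt-4o",
#   )

print(fallback_config["strategy"]["mode"])  # fallback
```

Keeping routing policy in a config object like this, rather than in application code, is what lets retries, fallbacks, and provider selection change without redeploying the app.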

TAGS

observability · gateway · routing · guardrails · prompts · llm · production