Accepting Beta Users for LangChain & AG2

Stop writing Dockerfiles.
Just Ship Your Agent.

The "Vercel for Backend Agents". Push your Python code to a serverless URL in one command. Includes built-in circuit breakers to prevent runaway API bills.

Free tier for beta testers. No credit card required.

zsh — 80x24
~/my-agent pip install agent-deploy
~/my-agent agent-deploy init
Detected framework: 🦜🔗 LangChain
Created: agent.config.json
~/my-agent agent-deploy up
📦 Zipping code...
🚀 Uploading to Agent Cloud...
🛡️ Injecting safety middleware...
✅ Live: https://agent-xyz.run.app

Why do builders use this?

Because setting up AWS ECS for a simple demo is overkill.

Instant Deploy

We scan your `requirements.txt` and build the container environment automatically. No `Dockerfile` hell.
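For example, a dependency file like this (package names and version pins purely illustrative) is all we need to build your container image:

```text
langchain>=0.2
openai>=1.0
requests
```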

Circuit Breakers

Our middleware detects infinite loops (e.g., 20+ identical steps) and kills the process before it drains your wallet.
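The idea behind the breaker can be sketched in a few lines of Python. This is a simplified illustration of the technique, not our production middleware; the threshold and the `LoopBreaker` name are illustrative:

```python
class LoopBreaker:
    """Trip after N identical consecutive steps (illustrative sketch)."""

    def __init__(self, max_repeats: int = 20):
        self.max_repeats = max_repeats
        self.last_step = None
        self.count = 0

    def check(self, step: str) -> None:
        # Count how many times the agent has repeated the same step in a row.
        if step == self.last_step:
            self.count += 1
        else:
            self.last_step, self.count = step, 1
        # Kill the run before it burns through your API budget.
        if self.count >= self.max_repeats:
            raise RuntimeError("Circuit breaker tripped: runaway loop detected")
```

In practice the platform calls an equivalent check on every agent step, so a stuck tool-calling loop terminates instead of billing indefinitely.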

Scale to Zero

Your agent only runs when it receives a request. Zero cost for idle time. Perfect for MVPs and demos.

Simple, Transparent Pricing

We are currently in Closed Beta. Early adopters get free compute.

CURRENT

Beta Access

$0/mo
  • 5 Active Agents
  • HTTPS Endpoints
  • Loop Protection (Safety)
  • Community Support
Join Waitlist
PLANNED

Pro (Agency)

$29/mo
  • Unlimited Agents
  • Custom Domain
  • Team Collaboration
  • Priority GPU Access

Frequently Asked Questions

Is this CPU- or GPU-based?

Currently, we support **CPU-only** workloads, which is perfect for LangChain/AG2 logic layers that call external APIs (OpenAI/Anthropic). We are working on serverless GPU support for local LLMs.

How do you handle API Keys?

We provide a secure `agent-deploy secrets set` command. Your keys are encrypted at rest and injected into the container at runtime. They are never stored in plain text.
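From your agent's perspective, injected secrets simply appear as environment variables at runtime. A minimal sketch of reading one (the `OPENAI_API_KEY` name and `load_secret` helper are illustrative, not part of our SDK):

```python
import os

def load_secret(name: str) -> str:
    # Secrets stored via `agent-deploy secrets set` are injected into the
    # container as environment variables; they never touch your code or repo.
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"Secret {name!r} was not injected")
    return value

# e.g. api_key = load_secret("OPENAI_API_KEY")
```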

Can I use this for production?

We are in Beta. While our infrastructure runs on enterprise-grade Google Cloud Run, we recommend using it for MVPs, demos, and internal tools while we stabilize the CLI.