Running OpenHands AI Coding Agent on Cloudflare Workers
Learn how to deploy autonomous AI coding agents on Cloudflare's edge infrastructure with OpenHands. Complete guide with SDK, examples, and deployment instructions for building AI-powered developer tools.
I’ve been experimenting with running autonomous AI agents on Cloudflare’s edge infrastructure, and I’m excited to share what I’ve built: an SDK that lets you run OpenHands directly inside Cloudflare Workers.
What is OpenHands?
OpenHands is an autonomous AI coding agent that can execute code, analyze files, browse the web, and solve complex programming tasks. Think of it as having an AI pair programmer that can actually run commands and make changes to your codebase. It’s powered by Claude and has some seriously impressive capabilities.
Why Run It on Cloudflare Workers?
Running OpenHands on the edge has some compelling advantages:
- Global Distribution: Your AI agent runs close to your users, wherever they are
- Serverless Simplicity: No infrastructure to manage, just deploy and go
- Cloudflare Sandbox: Secure, isolated containers that can run full Linux environments
- Cost Effective: Pay only for what you use, with Cloudflare’s generous pricing
The Challenge
OpenHands normally runs as a full agent-server application, which is far from a typical Workers workload. You need:
- A container environment to run the OpenHands server
- Proper networking and port management
- Request proxying and state management
- Integration with Durable Objects for persistence
This is where the cloudflare-openhands-sdk comes in.
What I Built
The SDK provides everything you need to run OpenHands on Cloudflare Workers:
- Pre-configured Dockerfile: Builds the OpenHands agent-server with all dependencies
- Simple API: High-level functions to create, manage, and proxy to OpenHands instances
- Built-in Routes: Optional routes for starting, stopping, and checking server status
- TypeScript Support: Fully typed for a great developer experience
Quick Start
Getting started is literally one command:
npm create cloudflare@latest -- openhands-example --template=jagzmz/cloudflare-openhands-sdk/examples/openhands
This creates a complete example with an /ask endpoint where you can send questions to the AI agent:
# Set your Anthropic API key
cp .dev.vars.example .dev.vars
# Edit .dev.vars with your key
# Start the dev server
npm run dev
# Ask the agent anything
MESSAGE="Write a haiku about edge computing"
curl --get --data-urlencode "message=$MESSAGE" "http://localhost:8787/ask"
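The same call works from code, too. Here’s a tiny helper that builds the /ask URL with the message safely URL-encoded (buildAskUrl is a hypothetical name, not part of the SDK):

```typescript
// Build the example's /ask URL with the message query parameter encoded.
// buildAskUrl is an illustrative helper, not part of the SDK.
function buildAskUrl(base: string, message: string): string {
  const url = new URL('/ask', base);
  url.searchParams.set('message', message); // handles encoding for you
  return url.toString();
}

// Mirrors the curl invocation above
const askUrl = buildAskUrl('http://localhost:8787', 'Write a haiku about edge computing');
// fetch(askUrl) would then hit the example's /ask endpoint
```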
How It Works
Under the hood, the SDK:
- Creates a Cloudflare Sandbox instance with the OpenHands container
- Starts the agent-server inside the container on port 8001
- Proxies requests from your Worker to the agent-server
- Manages conversations with the OpenHands API
- Handles cleanup and resource management
Here’s a simplified example of using the SDK directly inside a Worker’s fetch handler:

import { getSandbox } from '@cloudflare/sandbox';
import { createOpenhandsServer, proxyToOpenhands } from 'cloudflare-openhands-sdk/openhands';

export default {
  async fetch(request, env) {
    // Get (or create) a named sandbox backed by a Durable Object
    const sandbox = getSandbox(env.Sandbox, 'my-sandbox');

    // Start the OpenHands agent-server inside the container
    const server = await createOpenhandsServer(sandbox, {
      port: 8001,
      exposePort: true,
      hostname: 'yourdomain.com'
    });

    // Forward the incoming request to the agent-server
    return proxyToOpenhands(request, sandbox, server);
  }
};
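Conceptually, the proxy step boils down to rewriting the incoming URL so it targets the agent-server’s port while preserving the path and query string. A minimal sketch of that rewrite (rewriteToAgentServer is illustrative; the SDK’s real proxy also handles headers, streaming, and container lifecycle):

```typescript
// Rewrite an incoming Worker request URL to target the agent-server.
// rewriteToAgentServer is an illustrative helper, not the SDK's actual code.
function rewriteToAgentServer(requestUrl: string, port: number): string {
  const url = new URL(requestUrl);
  url.protocol = 'http:';            // plain HTTP inside the container
  url.host = `localhost:${port}`;    // agent-server listens on this port
  return url.toString();             // path and query are preserved
}

const target = rewriteToAgentServer('https://yourdomain.com/api/conversations?limit=5', 8001);
// target now points at the in-container server, path and query intact
```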
Real-World Use Cases
I can see this being useful for:
- AI-powered dev tools: Build coding assistants that run on the edge
- Automated code review: Let OpenHands analyze PRs and suggest improvements
- Interactive documentation: Documentation that can answer questions and show examples
- Internal tools: Company-specific AI agents with custom prompts and context
Technical Details
The example demonstrates several advanced patterns:
- Custom Request Handlers: Build your own logic on top of the SDK
- Agent Configuration: Customize the LLM model, temperature, and working directory
- Durable Objects: Persistent storage for conversation state
- Container Management: Proper handling of container lifecycle and ports
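As a taste of the custom-handler pattern, a handler might validate and parse the user’s message before handing it to the SDK. The sketch below shows only that request-parsing half; the payload shape and model id are illustrative placeholders, not the exact OpenHands conversation API:

```typescript
// Parse an incoming /ask request into an agent payload.
// AgentPayload and the model id are illustrative; consult the SDK's
// example for the real OpenHands conversation API shape.
interface AgentPayload {
  model: string;
  message: string;
}

function parseAskRequest(requestUrl: string): AgentPayload {
  const url = new URL(requestUrl);
  const message = url.searchParams.get('message');
  if (!message) {
    throw new Error('missing "message" query parameter');
  }
  // Model id is a placeholder, not a real configuration value
  return { model: 'anthropic/claude', message };
}
```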
Deployment
Deploying to production is straightforward:
npm run deploy
The container image gets built and pushed to Cloudflare’s registry. Wait for provisioning (you can check status in the Cloudflare Dashboard), and you’re live!
Don’t forget to add your Anthropic API key as a secret:
npx wrangler secret put ANTHROPIC_API_KEY
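For reference, the piece that makes this deploy work is the container and Durable Object wiring in the template’s wrangler config. Roughly, it looks like the fragment below; the field names follow Cloudflare’s containers configuration, but check the config the template generates rather than copying this sketch:

```
{
  "containers": [
    { "class_name": "Sandbox", "image": "./Dockerfile", "max_instances": 1 }
  ],
  "durable_objects": {
    "bindings": [{ "name": "Sandbox", "class_name": "Sandbox" }]
  },
  "migrations": [
    { "tag": "v1", "new_sqlite_classes": ["Sandbox"] }
  ]
}
```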
What’s Next?
This is just the beginning. Some ideas I’m exploring:
- Multi-turn conversation support with persistent sessions
- Custom agent configurations and tools
- Integration with other Cloudflare services (R2, KV, D1)
- WebSocket support for real-time agent interactions
Try It Out
The code is open source and available on GitHub.
If you’re interested in AI agents, edge computing, or just want to experiment with something cool, give it a try! I’d love to hear what you build with it.
Acknowledgments
Shoutout to the Cloudflare Sandbox SDK opencode example, which inspired this project.
Have questions or feedback? Find me on the socials below.