Build and Deploy a Serverless API with Cloudflare Workers, Hono, and D1

Deploy APIs at the edge, globally, in seconds. Cloudflare Workers run your code in 300+ data centers worldwide with zero cold starts. Paired with Hono's blazing-fast routing and D1's serverless SQLite, you get a full-stack API without managing a single server.
What You'll Build
In this tutorial, you'll create a complete task management API with CRUD operations, backed by a D1 SQLite database, built with the Hono framework, and deployed globally on Cloudflare Workers. By the end, you'll have a production-ready API running at the edge.
Features of the final API:
- RESTful endpoints for tasks (create, read, update, delete)
- Input validation and error handling
- SQLite database with migrations
- CORS support for frontend consumption
- Global deployment with near-zero latency
Prerequisites
Before you begin, make sure you have:
- Node.js 18+ installed (download here)
- A Cloudflare account — free tier is sufficient (sign up)
- Wrangler CLI — Cloudflare's development tool (we'll install it)
- Basic familiarity with TypeScript and REST APIs
- A code editor (VS Code recommended)
Cloudflare Workers free tier includes 100,000 requests/day and D1 gives you 5 million row reads/day — more than enough for most projects and prototyping.
Step 1: Install Wrangler and Authenticate
Wrangler is the CLI tool for developing and deploying Cloudflare Workers. Install it globally:
```bash
npm install -g wrangler
```

Then authenticate with your Cloudflare account:

```bash
wrangler login
```

This opens a browser window. Authorize Wrangler, then verify the connection:

```bash
wrangler whoami
```

You should see your account name and ID.
Step 2: Scaffold the Project
Create a new Hono project configured for Cloudflare Workers:
```bash
npm create hono@latest task-api
```

When prompted:

- Which template? → `cloudflare-workers`
- Package manager? → `npm` (or your preference)

Navigate into the project:

```bash
cd task-api
npm install
```

Your project structure looks like this:

```
task-api/
├── src/
│   └── index.ts      # Main application entry
├── wrangler.toml     # Cloudflare configuration
├── package.json
└── tsconfig.json
```
Step 3: Create the D1 Database
D1 is Cloudflare's serverless SQLite database. Create one for your project:
```bash
wrangler d1 create task-db
```

The output will include a database ID. Copy it — you need it for the configuration:

```
✅ Successfully created DB 'task-db'

[[d1_databases]]
binding = "DB"
database_name = "task-db"
database_id = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
```
Open wrangler.toml and add the D1 binding:

```toml
name = "task-api"
main = "src/index.ts"
compatibility_date = "2026-02-25"

[[d1_databases]]
binding = "DB"
database_name = "task-db"
database_id = "YOUR_DATABASE_ID_HERE"
```

Replace `YOUR_DATABASE_ID_HERE` with the actual database ID from the output.
Step 4: Define the Database Schema
Create a schema.sql file in your project root:
```sql
-- schema.sql
DROP TABLE IF EXISTS tasks;

CREATE TABLE tasks (
  id INTEGER PRIMARY KEY AUTOINCREMENT,
  title TEXT NOT NULL,
  description TEXT DEFAULT '',
  status TEXT DEFAULT 'pending' CHECK(status IN ('pending', 'in_progress', 'completed')),
  priority INTEGER DEFAULT 0 CHECK(priority BETWEEN 0 AND 3),
  created_at TEXT DEFAULT (datetime('now')),
  updated_at TEXT DEFAULT (datetime('now'))
);

-- Seed some initial data
INSERT INTO tasks (title, description, status, priority) VALUES
  ('Set up CI/CD pipeline', 'Configure GitHub Actions for automated deployments', 'pending', 2),
  ('Write API documentation', 'Create OpenAPI spec for the task management API', 'in_progress', 1),
  ('Design database schema', 'Finalize the ERD for the project', 'completed', 3);
```

Apply the schema to your local D1 database (for development):

```bash
wrangler d1 execute task-db --local --file=./schema.sql
```

And to the remote (production) database:

```bash
wrangler d1 execute task-db --remote --file=./schema.sql
```

The `--remote` flag modifies your production database. In a real project, use D1 migrations (`wrangler d1 migrations`) for safe, versioned schema changes.
Step 5: Define TypeScript Types
Create src/types.ts to define your data models and bindings:
```ts
// src/types.ts
export interface Env {
  DB: D1Database;
}

export interface Task {
  id: number;
  title: string;
  description: string;
  status: 'pending' | 'in_progress' | 'completed';
  priority: number;
  created_at: string;
  updated_at: string;
}

export interface CreateTaskInput {
  title: string;
  description?: string;
  status?: Task['status'];
  priority?: number;
}

export interface UpdateTaskInput {
  title?: string;
  description?: string;
  status?: Task['status'];
  priority?: number;
}
```

Step 6: Build the API Routes
Now for the core of the application. Replace the contents of src/index.ts with a full Hono application:
```ts
// src/index.ts
import { Hono } from 'hono';
import { cors } from 'hono/cors';
import type { Env, Task, CreateTaskInput, UpdateTaskInput } from './types';

const app = new Hono<{ Bindings: Env }>();

// ─── Middleware ───────────────────────────────────────────────
app.use('/*', cors());

// ─── Health Check ────────────────────────────────────────────
app.get('/', (c) => {
  return c.json({
    status: 'ok',
    service: 'Task Management API',
    version: '1.0.0',
    timestamp: new Date().toISOString(),
  });
});

// ─── GET /tasks ──────────────────────────────────────────────
// List all tasks with optional filtering
app.get('/tasks', async (c) => {
  const status = c.req.query('status');
  const sortBy = c.req.query('sort') || 'created_at';
  const order = c.req.query('order') || 'desc';
  // Fall back to defaults if the query values aren't valid numbers
  const limit = Math.min(parseInt(c.req.query('limit') || '50', 10) || 50, 100);
  const offset = parseInt(c.req.query('offset') || '0', 10) || 0;

  let query = 'SELECT * FROM tasks';
  const params: (string | number)[] = [];

  if (status) {
    query += ' WHERE status = ?';
    params.push(status);
  }

  // Only allow safe column names for sorting
  const allowedSorts = ['created_at', 'updated_at', 'priority', 'title'];
  const safeSort = allowedSorts.includes(sortBy) ? sortBy : 'created_at';
  const safeOrder = order.toUpperCase() === 'ASC' ? 'ASC' : 'DESC';
  query += ` ORDER BY ${safeSort} ${safeOrder} LIMIT ? OFFSET ?`;
  params.push(limit, offset);

  try {
    const result = await c.env.DB.prepare(query)
      .bind(...params)
      .all<Task>();

    return c.json({
      tasks: result.results,
      meta: {
        total: result.results.length, // rows in this page, not the whole table
        limit,
        offset,
      },
    });
  } catch (error) {
    return c.json({ error: 'Failed to fetch tasks' }, 500);
  }
});

// ─── GET /tasks/:id ─────────────────────────────────────────
// Get a single task by ID
app.get('/tasks/:id', async (c) => {
  const id = c.req.param('id');

  try {
    const task = await c.env.DB.prepare('SELECT * FROM tasks WHERE id = ?')
      .bind(id)
      .first<Task>();

    if (!task) {
      return c.json({ error: 'Task not found' }, 404);
    }
    return c.json({ task });
  } catch (error) {
    return c.json({ error: 'Failed to fetch task' }, 500);
  }
});

// ─── POST /tasks ─────────────────────────────────────────────
// Create a new task
app.post('/tasks', async (c) => {
  let body: CreateTaskInput;
  try {
    body = await c.req.json<CreateTaskInput>();
  } catch {
    return c.json({ error: 'Invalid JSON body' }, 400);
  }

  // Validation
  if (!body.title || body.title.trim().length === 0) {
    return c.json({ error: 'Title is required' }, 400);
  }
  if (body.title.length > 255) {
    return c.json({ error: 'Title must be 255 characters or less' }, 400);
  }
  const validStatuses = ['pending', 'in_progress', 'completed'];
  if (body.status && !validStatuses.includes(body.status)) {
    return c.json({ error: `Status must be one of: ${validStatuses.join(', ')}` }, 400);
  }
  if (body.priority !== undefined && (body.priority < 0 || body.priority > 3)) {
    return c.json({ error: 'Priority must be between 0 and 3' }, 400);
  }

  try {
    const result = await c.env.DB.prepare(
      `INSERT INTO tasks (title, description, status, priority)
       VALUES (?, ?, ?, ?)
       RETURNING *`
    )
      .bind(
        body.title.trim(),
        body.description || '',
        body.status || 'pending',
        body.priority ?? 0
      )
      .first<Task>();

    return c.json({ task: result }, 201);
  } catch (error) {
    return c.json({ error: 'Failed to create task' }, 500);
  }
});

// ─── PUT /tasks/:id ─────────────────────────────────────────
// Update an existing task
app.put('/tasks/:id', async (c) => {
  const id = c.req.param('id');

  let body: UpdateTaskInput;
  try {
    body = await c.req.json<UpdateTaskInput>();
  } catch {
    return c.json({ error: 'Invalid JSON body' }, 400);
  }

  // Check task exists
  const existing = await c.env.DB.prepare('SELECT * FROM tasks WHERE id = ?')
    .bind(id)
    .first<Task>();

  if (!existing) {
    return c.json({ error: 'Task not found' }, 404);
  }

  // Build dynamic update
  const updates: string[] = [];
  const values: (string | number)[] = [];

  if (body.title !== undefined) {
    if (body.title.trim().length === 0) {
      return c.json({ error: 'Title cannot be empty' }, 400);
    }
    updates.push('title = ?');
    values.push(body.title.trim());
  }
  if (body.description !== undefined) {
    updates.push('description = ?');
    values.push(body.description);
  }
  if (body.status !== undefined) {
    const validStatuses = ['pending', 'in_progress', 'completed'];
    if (!validStatuses.includes(body.status)) {
      return c.json({ error: `Status must be one of: ${validStatuses.join(', ')}` }, 400);
    }
    updates.push('status = ?');
    values.push(body.status);
  }
  if (body.priority !== undefined) {
    if (body.priority < 0 || body.priority > 3) {
      return c.json({ error: 'Priority must be between 0 and 3' }, 400);
    }
    updates.push('priority = ?');
    values.push(body.priority);
  }

  if (updates.length === 0) {
    return c.json({ error: 'No fields to update' }, 400);
  }

  updates.push("updated_at = datetime('now')");
  values.push(id); // id binds to the trailing WHERE placeholder

  try {
    const result = await c.env.DB.prepare(
      `UPDATE tasks SET ${updates.join(', ')} WHERE id = ? RETURNING *`
    )
      .bind(...values)
      .first<Task>();

    return c.json({ task: result });
  } catch (error) {
    return c.json({ error: 'Failed to update task' }, 500);
  }
});

// ─── DELETE /tasks/:id ───────────────────────────────────────
// Delete a task
app.delete('/tasks/:id', async (c) => {
  const id = c.req.param('id');

  try {
    const existing = await c.env.DB.prepare('SELECT id FROM tasks WHERE id = ?')
      .bind(id)
      .first();

    if (!existing) {
      return c.json({ error: 'Task not found' }, 404);
    }

    await c.env.DB.prepare('DELETE FROM tasks WHERE id = ?')
      .bind(id)
      .run();

    return c.json({ message: 'Task deleted successfully' });
  } catch (error) {
    return c.json({ error: 'Failed to delete task' }, 500);
  }
});

// ─── 404 Handler ─────────────────────────────────────────────
app.notFound((c) => {
  return c.json({ error: 'Not found' }, 404);
});

// ─── Error Handler ───────────────────────────────────────────
app.onError((err, c) => {
  console.error('Unhandled error:', err);
  return c.json({ error: 'Internal server error' }, 500);
});

export default app;
```

Step 7: Test Locally
Wrangler provides a local development server that emulates the Workers runtime, including D1:
```bash
wrangler dev
```

Your API is now running at http://localhost:8787. Test it with curl:

```bash
# Health check
curl http://localhost:8787/

# List all tasks
curl http://localhost:8787/tasks

# Create a new task
curl -X POST http://localhost:8787/tasks \
  -H "Content-Type: application/json" \
  -d '{"title": "Learn Cloudflare Workers", "description": "Complete the tutorial", "priority": 2}'

# Get a specific task
curl http://localhost:8787/tasks/1

# Update a task
curl -X PUT http://localhost:8787/tasks/1 \
  -H "Content-Type: application/json" \
  -d '{"status": "completed"}'

# Delete a task
curl -X DELETE http://localhost:8787/tasks/4

# Filter by status
curl "http://localhost:8787/tasks?status=pending&sort=priority&order=desc"
```

You should see JSON responses for each operation. The local D1 database persists between restarts in `.wrangler/state/`.
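If you plan to call these filter endpoints from a frontend, a small typed helper keeps the query parameters in one place. This is an illustrative sketch, not part of the API: the `ListTasksQuery` interface and `buildTasksUrl` function are names we're inventing here, though the parameter names mirror the `GET /tasks` route.

```typescript
// Client-side helper for building GET /tasks URLs (illustrative sketch).
// The query parameter names match the route's filters; the helper is hypothetical.
interface ListTasksQuery {
  status?: 'pending' | 'in_progress' | 'completed';
  sort?: 'created_at' | 'updated_at' | 'priority' | 'title';
  order?: 'asc' | 'desc';
  limit?: number;
  offset?: number;
}

function buildTasksUrl(base: string, q: ListTasksQuery = {}): string {
  const params = new URLSearchParams();
  // Only include parameters that were actually set
  for (const [key, value] of Object.entries(q)) {
    if (value !== undefined) params.set(key, String(value));
  }
  const qs = params.toString();
  return qs ? `${base}/tasks?${qs}` : `${base}/tasks`;
}

// Example:
// buildTasksUrl('http://localhost:8787', { status: 'pending', sort: 'priority', order: 'desc' })
// → 'http://localhost:8787/tasks?status=pending&sort=priority&order=desc'
```

Centralizing the query construction this way also gives you one place to evolve when you later add filters such as pagination cursors.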
Hot reloading: Wrangler watches your files and automatically reloads when you save changes. No need to restart the dev server.
Step 8: Add Request Validation Middleware
For a more robust API, let's add a reusable validation middleware. Create src/middleware.ts:
```ts
// src/middleware.ts
import type { Context, Next } from 'hono';

export function validateJson() {
  return async (c: Context, next: Next) => {
    if (['POST', 'PUT', 'PATCH'].includes(c.req.method)) {
      const contentType = c.req.header('content-type');
      if (!contentType?.includes('application/json')) {
        return c.json(
          { error: 'Content-Type must be application/json' },
          415
        );
      }
    }
    await next();
  };
}

export function requestLogger() {
  return async (c: Context, next: Next) => {
    const start = Date.now();
    await next();
    const duration = Date.now() - start;
    console.log(
      `${c.req.method} ${c.req.path} → ${c.res.status} (${duration}ms)`
    );
  };
}

// Note: this Map lives in memory, which is per-isolate and resets whenever the
// Worker is evicted — fine for basic abuse protection, but use Durable Objects
// or Cloudflare's rate limiting rules when you need reliable limits.
export function rateLimit(maxRequests: number, windowMs: number) {
  const requests = new Map<string, { count: number; resetAt: number }>();

  return async (c: Context, next: Next) => {
    const ip = c.req.header('cf-connecting-ip') || 'unknown';
    const now = Date.now();
    const record = requests.get(ip);

    if (!record || now > record.resetAt) {
      requests.set(ip, { count: 1, resetAt: now + windowMs });
    } else if (record.count >= maxRequests) {
      return c.json({ error: 'Too many requests' }, 429);
    } else {
      record.count++;
    }
    await next();
  };
}
```

Then add the middleware to your index.ts:

```ts
import { validateJson, requestLogger } from './middleware';

// Add after cors()
app.use('/*', requestLogger());
app.use('/tasks/*', validateJson());
```

Step 9: Add D1 Migrations (Production Best Practice)
Instead of running raw SQL, use Wrangler's migration system for production:
```bash
# Create a migrations directory
wrangler d1 migrations create task-db init
```

This creates a file in `migrations/`. Paste your schema SQL into it. Then apply:

```bash
# Apply locally
wrangler d1 migrations apply task-db --local

# Apply to production
wrangler d1 migrations apply task-db --remote
```

Future schema changes become new migration files, giving you version control over your database schema.
Step 10: Deploy to Production
Deploying is a single command:
```bash
wrangler deploy
```

Output:

```
⛅️ wrangler 3.x.x
Uploaded task-api (1.42 sec)
Published task-api (0.35 sec)
  https://task-api.YOUR_SUBDOMAIN.workers.dev
```

Your API is now live on Cloudflare's global network! Test it:

```bash
curl https://task-api.YOUR_SUBDOMAIN.workers.dev/tasks
```

Zero cold starts: unlike AWS Lambda or Google Cloud Functions, Cloudflare Workers typically start in under 5ms, so your API responds almost instantly from locations near your users.
Step 11: Add a Custom Domain (Optional)
If you want your API on a custom domain, add a route in wrangler.toml:
```toml
routes = [
  { pattern = "api.yourdomain.com/*", zone_name = "yourdomain.com" }
]
```

Make sure the domain is added to your Cloudflare account. Then redeploy:

```bash
wrangler deploy
```

Your API is now available at https://api.yourdomain.com/tasks.
Step 12: Monitor and Debug
Cloudflare provides real-time logging for your Workers:
```bash
# Stream live logs from production
wrangler tail
```

You'll see every request, response status, and any console.log output in real time. For more advanced monitoring, check the Workers Analytics dashboard in Cloudflare's UI.

To debug locally with breakpoints, run the dev server (in Wrangler v3 the DevTools inspector is enabled by default; use `--inspector-port` to change its port):

```bash
wrangler dev
```

Then connect Chrome DevTools to the inspector URL that Wrangler prints.
Project Structure (Final)
```
task-api/
├── src/
│   ├── index.ts        # Main Hono app with routes
│   ├── types.ts        # TypeScript interfaces
│   └── middleware.ts   # Custom middleware
├── migrations/
│   └── 0001_init.sql   # D1 migration
├── schema.sql          # Initial schema (reference)
├── wrangler.toml       # Cloudflare configuration
├── package.json
└── tsconfig.json
```
Performance Comparison
Why choose this stack? Here's how it compares:
| Metric | Cloudflare Workers + D1 | AWS Lambda + RDS | Vercel Serverless |
|---|---|---|---|
| Cold start | < 5ms | 100-500ms | 50-250ms |
| Global distribution | 300+ locations | Region-based | Region-based |
| Database latency | Co-located D1 | VPC-dependent | External DB |
| Free tier | 100K req/day | 1M req/month | 100K req/month |
| Pricing (paid) | $0.30/million req | $0.20/million + compute | $0.60/million req |
Tips and Best Practices
- **Keep Workers lightweight.** Workers have a compressed script size limit (1 MB historically; larger on current plans), which encourages small, focused services. If your API grows large, split it into multiple Workers.

- **Use D1 batching for bulk operations.** `db.batch([stmt1, stmt2])` runs multiple statements in a single round trip.

- **Enable Smart Placement** in `wrangler.toml` to let Cloudflare auto-place your Worker near your D1 database:

  ```toml
  [placement]
  mode = "smart"
  ```

- **Use bindings for secrets** instead of hardcoding API keys:

  ```bash
  wrangler secret put API_KEY
  ```

  Access the value via `c.env.API_KEY` in your code.

- **Leverage Hono's built-in middleware.** It includes JWT auth, bearer auth, basic auth, ETag, and more:

  ```ts
  import { bearerAuth } from 'hono/bearer-auth';

  app.use('/admin/*', bearerAuth({ token: 'secret' }));
  ```

- **Use `c.executionCtx.waitUntil()`** for fire-and-forget tasks like logging or analytics that shouldn't block the response.
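The batching tip above can be sketched as a small seeding helper. To keep the sketch self-contained, `BatchableDb` and `PreparedStmt` below are deliberately minimal stand-ins for the real `D1Database` and `D1PreparedStatement` types (only the methods this example touches), so treat it as a shape illustration rather than a drop-in:

```typescript
// Minimal structural slice of the D1 API used by this sketch (illustrative).
interface PreparedStmt {
  bind(...values: unknown[]): PreparedStmt;
}
interface BatchableDb {
  prepare(sql: string): PreparedStmt;
  batch(statements: PreparedStmt[]): Promise<unknown[]>;
}

// Insert many tasks in one round trip instead of N sequential queries.
async function seedTasks(db: BatchableDb, titles: string[]): Promise<number> {
  const sql = 'INSERT INTO tasks (title) VALUES (?)';
  const statements = titles.map((title) => db.prepare(sql).bind(title));
  await db.batch(statements); // one network round trip to D1
  return statements.length;
}
```

Inside a route you would call it as `await seedTasks(c.env.DB, [...])`; the real `D1Database` binding satisfies this shape.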
What's Next?
You now have a production-ready serverless API running globally. Here are ideas to extend it:
- Add authentication with Cloudflare Access or JWT tokens
- Implement cursor-based pagination for large datasets
- Add full-text search using SQLite's FTS5 extension
- Create a frontend with React/Next.js consuming your API
- Set up CI/CD with GitHub Actions and `wrangler deploy`
- Add caching with the Cache API or Cloudflare KV for read-heavy endpoints
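For the pagination idea above, the cursor is typically an opaque token encoding the last row the client saw. A minimal sketch, assuming we choose the row `id` as the cursor key and base64url-encoded JSON as the encoding (both our choices for illustration; nothing here is prescribed by D1). Note `Buffer` is a Node API: in a Worker you would enable the `nodejs_compat` flag or swap in `btoa`/`atob`:

```typescript
// Encode/decode an opaque pagination cursor from the last task id seen.
// The scheme (JSON + base64url over the row id) is an illustrative choice.
function encodeCursor(lastId: number): string {
  return Buffer.from(JSON.stringify({ lastId })).toString('base64url');
}

function decodeCursor(cursor: string): number | null {
  try {
    const { lastId } = JSON.parse(
      Buffer.from(cursor, 'base64url').toString('utf8')
    );
    return typeof lastId === 'number' ? lastId : null;
  } catch {
    return null; // malformed cursor → treat as "start from the beginning"
  }
}

// In a route you would then query:
//   SELECT * FROM tasks WHERE id > ? ORDER BY id LIMIT ?
// and return encodeCursor(lastRow.id) to the client as `next_cursor`.
```

Unlike `LIMIT`/`OFFSET`, a keyed cursor stays fast on large tables because each page is an indexed range scan rather than a skip-and-count.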
Summary
In this tutorial, you built a complete serverless REST API using three powerful technologies:
- Cloudflare Workers — serverless compute running at the edge with zero cold starts
- Hono — an ultrafast web framework designed for edge runtimes
- D1 — Cloudflare's serverless SQLite database with global replication
You set up the project, defined a database schema, built CRUD routes with validation and error handling, tested locally, and deployed globally — all without provisioning or managing any servers. The entire stack runs on Cloudflare's free tier, making it an excellent choice for side projects, MVPs, and production APIs alike.
The serverless edge is no longer the future — it's the present. Start building. 🚀