Long-running jobs that survive crashes, retries, and restarts — without a queue, a database, or a single server. Cloudflare Workflows brings durable execution to Workers: write multi-step code as if it were synchronous, and let the platform handle persistence, retries, and recovery for you.
What You'll Build
In this tutorial, you'll build a complete order processing pipeline as a Cloudflare Workflow. The pipeline:
- Validates an incoming order
- Charges the customer's card via an external payment API
- Reserves inventory in a downstream service
- Sleeps until the next business day before scheduling shipment
- Sends a confirmation email
- Recovers gracefully from any step failure with automatic retries
By the end you'll have a production-ready, idempotent workflow that runs at the edge, survives Worker restarts, and can pause for hours or days without consuming compute.
Prerequisites
Before you begin, make sure you have:
- Node.js 20+ installed
- A Cloudflare account — the free tier supports Workflows
- Wrangler CLI v3.99+ — Cloudflare's developer tool (installed below)
- Basic familiarity with TypeScript and async/await
- A code editor (VS Code recommended)
Why Workflows? Traditional Worker scripts must complete within seconds. Workflows can run for minutes, hours, or even days. They're built for the messy reality of distributed systems: external APIs go down, payments need retries, and humans need time to approve things.
Step 1: Install Wrangler and Scaffold the Project
Start by installing Wrangler globally and creating a new Workers project with Workflows support.
npm install -g wrangler@latest
wrangler login

A browser window will open to authenticate with Cloudflare. Once you're back in the terminal, scaffold a new TypeScript project:
npm create cloudflare@latest noqta-order-pipeline -- \
--type=hello-world \
--lang=ts \
--git=true \
--deploy=false
cd noqta-order-pipeline

This creates a minimal Worker with TypeScript, Vitest, and a wrangler.jsonc config file.
Step 2: Enable Workflows in wrangler.jsonc
Open wrangler.jsonc and add a workflows binding. This tells Cloudflare that your Worker exposes a Workflow class named OrderPipeline:
{
"$schema": "node_modules/wrangler/config-schema.json",
"name": "noqta-order-pipeline",
"main": "src/index.ts",
"compatibility_date": "2026-05-01",
"compatibility_flags": ["nodejs_compat"],
"workflows": [
{
"name": "order-pipeline",
"binding": "ORDER_PIPELINE",
"class_name": "OrderPipeline"
}
],
"observability": {
"enabled": true
}
}

The binding is how your HTTP handler refers to the workflow at runtime. The class_name must match the class you'll export from your code.
Step 3: Model the Order Payload
Create a src/types.ts file with strict types for the order data flowing through the pipeline:
// src/types.ts
export interface OrderItem {
sku: string;
quantity: number;
unitPriceCents: number;
}
export interface OrderParams {
orderId: string;
customerId: string;
customerEmail: string;
paymentToken: string;
items: OrderItem[];
}
export interface PaymentReceipt {
transactionId: string;
chargedCents: number;
chargedAt: string;
}
export interface InventoryReservation {
reservationId: string;
reservedAt: string;
}

These types act as the contract between every step of the workflow. Because Workflows persist step outputs to durable storage, every value you return must be JSON-serializable: stick to interfaces built from primitives, arrays, and plain objects.
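To see why this matters, consider what durable storage does to a step's return value. The helper below is a hypothetical illustration (not part of the Workflows API): it round-trips a value through JSON, mimicking persistence. A Date silently degrades to a string, which is why the types above use ISO strings from the start.

```typescript
// Hypothetical helper: round-trips a value through JSON, mimicking what
// durable storage does to a step's return value. Anything that doesn't
// survive the round-trip (Date, Map, Set, class instances) should be
// converted before returning it from step.do.
function asStepOutput<T>(value: T): T {
  return JSON.parse(JSON.stringify(value)) as T;
}

// A Date silently degrades to a string after the round-trip...
const raw = { chargedAt: new Date("2026-05-01T09:00:00Z") };
const persisted = asStepOutput(raw);
console.log(typeof persisted.chargedAt); // "string"

// ...so return the ISO string explicitly, as PaymentReceipt does:
const safe = { chargedAt: new Date("2026-05-01T09:00:00Z").toISOString() };
console.log(asStepOutput(safe).chargedAt); // "2026-05-01T09:00:00.000Z"
```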
Step 4: Write the Workflow Class
Now for the centerpiece. Replace src/index.ts with the following:
// src/index.ts
import {
WorkflowEntrypoint,
WorkflowEvent,
WorkflowStep,
} from "cloudflare:workers";
import type {
OrderParams,
PaymentReceipt,
InventoryReservation,
} from "./types";
interface Env {
ORDER_PIPELINE: Workflow;
PAYMENT_API_KEY: string;
INVENTORY_API_URL: string;
EMAIL_API_URL: string;
}
export class OrderPipeline extends WorkflowEntrypoint<Env, OrderParams> {
async run(event: WorkflowEvent<OrderParams>, step: WorkflowStep) {
const order = event.payload;
// Step 1 — validate
await step.do("validate-order", async () => {
if (order.items.length === 0) {
throw new Error("Order has no items");
}
const total = order.items.reduce(
(sum, i) => sum + i.unitPriceCents * i.quantity,
0
);
if (total <= 0) throw new Error("Order total must be positive");
return { total };
});
// Step 2 — charge payment with retries
const receipt = await step.do<PaymentReceipt>(
"charge-payment",
{
retries: {
limit: 5,
delay: "10 seconds",
backoff: "exponential",
},
timeout: "30 seconds",
},
async () => chargeCard(order, this.env.PAYMENT_API_KEY)
);
// Step 3 — reserve inventory
const reservation = await step.do<InventoryReservation>(
"reserve-inventory",
{ retries: { limit: 3, delay: "5 seconds", backoff: "exponential" } },
async () => reserveInventory(order, this.env.INVENTORY_API_URL)
);
// Step 4 — wait until the next business day
await step.sleepUntil("wait-next-business-day", nextBusinessDay());
// Step 5 — confirmation email
await step.do("send-confirmation", async () =>
sendEmail(order.customerEmail, receipt, reservation, this.env.EMAIL_API_URL)
);
return {
status: "completed",
transactionId: receipt.transactionId,
reservationId: reservation.reservationId,
};
}
}
export default {
async fetch(req: Request, env: Env): Promise<Response> {
if (req.method !== "POST") {
return new Response("Use POST to start an order", { status: 405 });
}
const body = (await req.json()) as OrderParams;
const instance = await env.ORDER_PIPELINE.create({
id: body.orderId,
params: body,
});
return Response.json({
instanceId: instance.id,
status: await instance.status(),
});
},
} satisfies ExportedHandler<Env>;

A few things worth pausing on:
- step.do(name, ...) is the durable primitive. Each named step runs at most once successfully; its return value is persisted and replayed on resume.
- The retries config is declarative. If chargeCard throws, the platform retries with exponential backoff — your code stays clean.
- step.sleepUntil releases compute. The workflow consumes zero CPU while sleeping and resumes at the scheduled time.
- Returning a value from step.do makes it available to later steps even after the Worker is evicted from memory.
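To build intuition for the retries config, here is a hypothetical sketch of the delay schedule it implies, assuming exponential backoff doubles the base delay on each attempt (the platform's exact schedule may differ). The RetryPolicy type and retrySchedule function are illustrations, not part of the Workflows API:

```typescript
// Hypothetical sketch: the delay schedule implied by a retries config,
// assuming exponential backoff doubles the base delay on each attempt.
interface RetryPolicy {
  limit: number;        // maximum number of retries after the first failure
  delaySeconds: number; // base delay before the first retry
  backoff: "constant" | "exponential";
}

function retrySchedule(policy: RetryPolicy): number[] {
  const delays: number[] = [];
  for (let attempt = 0; attempt < policy.limit; attempt++) {
    delays.push(
      policy.backoff === "exponential"
        ? policy.delaySeconds * 2 ** attempt
        : policy.delaySeconds
    );
  }
  return delays;
}

// Mirrors the "charge-payment" config above (limit: 5, delay: 10 seconds):
// 10s, 20s, 40s, 80s, 160s — roughly five minutes of total retry budget.
console.log(retrySchedule({ limit: 5, delaySeconds: 10, backoff: "exponential" }));
```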
Step 5: Implement the External Calls
Append the helper functions to src/index.ts (or extract them into their own module):
async function chargeCard(
order: OrderParams,
apiKey: string
): Promise<PaymentReceipt> {
const total = order.items.reduce(
(sum, i) => sum + i.unitPriceCents * i.quantity,
0
);
const res = await fetch("https://payments.example.com/charge", {
method: "POST",
headers: {
Authorization: `Bearer ${apiKey}`,
"Content-Type": "application/json",
"Idempotency-Key": order.orderId,
},
body: JSON.stringify({
token: order.paymentToken,
amount: total,
currency: "USD",
}),
});
if (!res.ok) throw new Error(`Payment failed: ${res.status}`);
const data = (await res.json()) as { id: string; amount: number };
return {
transactionId: data.id,
chargedCents: data.amount,
chargedAt: new Date().toISOString(),
};
}
async function reserveInventory(
order: OrderParams,
apiUrl: string
): Promise<InventoryReservation> {
const res = await fetch(`${apiUrl}/reservations`, {
method: "POST",
headers: {
"Content-Type": "application/json",
"Idempotency-Key": `${order.orderId}-inv`,
},
body: JSON.stringify({ orderId: order.orderId, items: order.items }),
});
if (!res.ok) throw new Error(`Inventory error: ${res.status}`);
const data = (await res.json()) as { id: string };
return {
reservationId: data.id,
reservedAt: new Date().toISOString(),
};
}
async function sendEmail(
to: string,
receipt: PaymentReceipt,
reservation: InventoryReservation,
apiUrl: string
) {
await fetch(`${apiUrl}/send`, {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({
to,
subject: "Order confirmation",
body: `Charged ${receipt.chargedCents} cents. Reservation: ${reservation.reservationId}`,
}),
});
}
function nextBusinessDay(): Date {
const now = new Date();
const next = new Date(now);
next.setUTCDate(now.getUTCDate() + 1);
next.setUTCHours(9, 0, 0, 0);
const day = next.getUTCDay();
if (day === 6) next.setUTCDate(next.getUTCDate() + 2);
if (day === 0) next.setUTCDate(next.getUTCDate() + 1);
return next;
}Idempotency keys are critical. Workflow steps may execute more than once if a Worker is killed mid-flight. Pass an idempotency key to every external API that mutates state, scoped to the step name and the workflow instance ID.
Step 6: Add Secrets and Local Bindings
Workflows read secrets the same way regular Workers do. For local development, create a .dev.vars file in your project root:
PAYMENT_API_KEY="test_sk_local_xxxxxxxxxxxxxxxx"
INVENTORY_API_URL="https://staging.inventory.example.com"
EMAIL_API_URL="https://staging.email.example.com"

For production, push the secrets to Cloudflare:
wrangler secret put PAYMENT_API_KEY
wrangler secret put INVENTORY_API_URL
wrangler secret put EMAIL_API_URL

Each command prompts you for the value and stores it encrypted in Cloudflare's secret store.
Step 7: Run the Workflow Locally
Boot the dev server with full workflow simulation:
wrangler dev --x-dev-env

The CLI prints a local URL such as http://127.0.0.1:8787. In another terminal, kick off an order:
curl -X POST http://127.0.0.1:8787 \
-H "Content-Type: application/json" \
-d '{
"orderId": "ord_2026_001",
"customerId": "cust_42",
"customerEmail": "ada@example.com",
"paymentToken": "tok_fake_visa",
"items": [
{ "sku": "TSHIRT-RED-L", "quantity": 2, "unitPriceCents": 1999 }
]
}'

You'll receive a JSON response with an instanceId. Open http://127.0.0.1:8787/__workflows in your browser (Wrangler exposes a built-in inspector) and watch each step transition through running, success, and the long sleeping phase.
Step 8: Inspect and Control Running Instances
Add a second route to query and control workflows. Replace the fetch handler with this richer router:
export default {
async fetch(req: Request, env: Env): Promise<Response> {
const url = new URL(req.url);
const id = url.searchParams.get("id");
if (req.method === "POST" && url.pathname === "/orders") {
const body = (await req.json()) as OrderParams;
const instance = await env.ORDER_PIPELINE.create({
id: body.orderId,
params: body,
});
return Response.json({ instanceId: instance.id });
}
if (req.method === "GET" && url.pathname === "/orders" && id) {
const instance = await env.ORDER_PIPELINE.get(id);
return Response.json(await instance.status());
}
if (req.method === "POST" && url.pathname === "/orders/pause" && id) {
const instance = await env.ORDER_PIPELINE.get(id);
await instance.pause();
return Response.json({ paused: true });
}
if (req.method === "POST" && url.pathname === "/orders/resume" && id) {
const instance = await env.ORDER_PIPELINE.get(id);
await instance.resume();
return Response.json({ resumed: true });
}
return new Response("Not found", { status: 404 });
},
} satisfies ExportedHandler<Env>;

You can now:
- POST /orders — start a new order workflow
- GET /orders?id=ord_2026_001 — read the current step, status, and elapsed time
- POST /orders/pause?id=... — pause a workflow indefinitely
- POST /orders/resume?id=... — resume a paused workflow
This is the foundation for an admin dashboard or a customer support tool that lets agents pause shipments mid-flight.
Step 9: Add a Human-in-the-Loop Approval Step
For high-value orders, suppose you want a human to approve before payment. Use step.waitForEvent:
const total = order.items.reduce(
(sum, i) => sum + i.unitPriceCents * i.quantity,
0
);
if (total > 50000) {
const decision = await step.waitForEvent<{ approved: boolean }>(
"wait-for-manager-approval",
{
type: "order.approval",
timeout: "24 hours",
}
);
if (!decision.payload.approved) {
return { status: "rejected" };
}
}

Then expose a route to deliver the event:
if (req.method === "POST" && url.pathname === "/orders/approve" && id) {
const body = (await req.json()) as { approved: boolean };
const instance = await env.ORDER_PIPELINE.get(id);
await instance.sendEvent({ type: "order.approval", payload: body });
return Response.json({ delivered: true });
}

While waiting, the workflow consumes no compute and survives indefinite Worker restarts. When the manager approves through your admin UI, the workflow resumes exactly where it left off.
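On the admin side, a client just needs to POST the decision to that route. Here's a hypothetical sketch that builds the request (buildApprovalRequest is an illustration, not an existing API); the actual fetch is left to the caller:

```typescript
// Hypothetical sketch: construct the request an admin UI would send to the
// /orders/approve route. Separating construction from the fetch keeps it
// easy to test and reuse.
function buildApprovalRequest(
  baseUrl: string,
  orderId: string,
  approved: boolean
): { url: string; init: { method: string; headers: Record<string, string>; body: string } } {
  return {
    url: `${baseUrl}/orders/approve?id=${encodeURIComponent(orderId)}`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ approved }),
    },
  };
}

const { url, init } = buildApprovalRequest("https://admin.example.com", "ord_2026_001", true);
console.log(url);       // "https://admin.example.com/orders/approve?id=ord_2026_001"
console.log(init.body); // '{"approved":true}'
// A caller would then run: await fetch(url, init);
```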
Step 10: Write a Vitest Integration Test
Workflows are testable end-to-end with the Cloudflare Workers Vitest pool. Add this to test/order.spec.ts:
import { env, createExecutionContext, waitOnExecutionContext } from "cloudflare:test";
import { describe, it, expect } from "vitest";
import worker from "../src/index";
describe("OrderPipeline", () => {
it("creates a workflow instance for a valid order", async () => {
const req = new Request("http://example.com/orders", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({
orderId: `ord_${crypto.randomUUID()}`,
customerId: "cust_1",
customerEmail: "test@example.com",
paymentToken: "tok_test",
items: [{ sku: "SKU1", quantity: 1, unitPriceCents: 500 }],
}),
});
const ctx = createExecutionContext();
const res = await worker.fetch(req, env, ctx);
await waitOnExecutionContext(ctx);
expect(res.status).toBe(200);
const body = (await res.json()) as { instanceId: string };
expect(body.instanceId).toBeDefined();
});
});

Run it:

npm test

Tests run against an in-memory Workers runtime — no network, no real Cloudflare calls.
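Depending on how your project was scaffolded, you may also need a vitest.config.ts that selects the Workers pool. A minimal sketch, assuming the @cloudflare/vitest-pool-workers package is installed as a dev dependency:

```ts
// vitest.config.ts — minimal sketch for the Workers Vitest pool.
// Assumes @cloudflare/vitest-pool-workers is installed; it reuses the
// bindings and compatibility settings from your wrangler config.
import { defineWorkersConfig } from "@cloudflare/vitest-pool-workers/config";

export default defineWorkersConfig({
  test: {
    poolOptions: {
      workers: {
        wrangler: { configPath: "./wrangler.jsonc" },
      },
    },
  },
});
```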
Step 11: Deploy to Production
When you're ready to ship:
wrangler deploy

Wrangler uploads your Worker, registers the OrderPipeline workflow class, and prints a public URL. From this moment on, the workflow runs across Cloudflare's global edge network with built-in persistence.
Hit it with a real order:
curl -X POST https://noqta-order-pipeline.your-subdomain.workers.dev/orders \
-H "Content-Type: application/json" \
-d @order.json

Open the Cloudflare dashboard, navigate to Workers and Pages → Workflows, and click your instance to see a step-by-step timeline of every retry, sleep, and event.
Testing Your Implementation
Verify the workflow against this checklist:
- A valid order returns an instanceId and progresses through every step.
- Killing the dev server mid-flight and restarting it resumes from the last completed step.
- Forcing chargeCard to throw shows the retries firing with exponential backoff.
- Calling /orders/pause halts execution; /orders/resume continues from the exact step.
- High-value orders block at wait-for-manager-approval until you deliver the event.
Troubleshooting
Workflow class not found — confirm the class_name in wrangler.jsonc exactly matches the exported class.
Step output is not serializable — return only JSON-safe values from step.do. Convert Date objects to strings with .toISOString() and avoid Map, Set, or class instances.
Steps execute more than once — that's by design when a Worker crashes mid-step. Use idempotency keys on every external mutation.
Sleeps fire immediately during local dev — older Wrangler versions accelerated sleeps. Upgrade to wrangler@3.99 or newer, or pass --x-dev-env to use the production timer simulation.
Cannot find module cloudflare:workers — make sure your tsconfig.json includes "types": ["@cloudflare/workers-types/2026-05-01"] and that you've installed @cloudflare/workers-types.
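For reference, the troubleshooting entry above corresponds to a tsconfig.json along these lines (a sketch — the dated types entry should match your compatibility date):

```jsonc
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "ES2022",
    "moduleResolution": "Bundler",
    // Dated entry point matching the compatibility_date in wrangler.jsonc
    "types": ["@cloudflare/workers-types/2026-05-01"]
  }
}
```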
Next Steps
- Layer Queues between user-facing requests and workflow creation so spikes don't overload downstream APIs. See our Cloudflare Workers + Hono + D1 tutorial.
- Persist workflow outputs to D1 or R2 for long-term querying alongside your existing data.
- Compare with Temporal durable workflows in TypeScript when you need on-prem or multi-cloud portability.
- Wire workflows into Inngest event-driven flows for richer observability around event sourcing.
- Add structured logs with OpenTelemetry tracing on every step boundary.
Conclusion
Cloudflare Workflows reshapes how we write long-running serverless code. Instead of cobbling together queues, cron schedulers, dead-letter inboxes, and retry state machines, you write straight-line TypeScript and let the platform handle durability. Combined with Workers, D1, Queues, and R2, it forms the missing piece in Cloudflare's full-stack story — and brings durable execution within reach of any team comfortable with npm install.
In this tutorial you went from a blank project to a production-grade order pipeline with retries, sleeps, pause and resume, human-in-the-loop approvals, and integration tests. Take it as a template: swap in your own external APIs and you have a resilient backend that scales to zero, costs cents per million invocations, and runs in every Cloudflare data center on Earth.