
Welcome to Durable Streams

HTTP-based durable streams for streaming data reliably to web browsers, mobile apps, and native clients with offset-based resumability. Durable Streams provides a simple, production-proven protocol for creating and consuming ordered, replayable data streams with support for catch-up reads and live tailing.
Read the Announcing Durable Streams post on the Electric blog.

The Missing Primitive

Modern applications frequently need ordered, durable sequences of data that can be replayed from arbitrary points and tailed in real time. Common patterns include:

AI Conversation Streaming

Stream LLM token responses with resume capability across reconnections

Agentic Apps

Stream tool outputs and progress events with replay and clean reconnect semantics

Database Synchronization

Stream database changes to web, mobile, and native clients

Collaborative Editing

Sync CRDTs and operational transforms across devices

Real-time Updates

Push application state to clients with guaranteed delivery

Event Sourcing

Build event-sourced architectures with client-side replay

While durable streams exist throughout backend infrastructure (database WALs, Kafka topics, event stores), they aren’t available as a first-class primitive for client applications. There’s no simple, HTTP-based durable stream that sits alongside databases and object storage as a standard cloud primitive.

Why Durable Streams?

WebSocket and SSE connections are easy to start, but they’re fragile in practice: tabs get suspended, networks flap, devices switch, pages refresh. When that happens, you either lose in-flight data or build a bespoke backend storage and client resume protocol on top.

AI products make this painfully visible. Token streaming is the UI for chat and copilots, and agentic apps stream progress events, tool outputs, and partial results over long-running sessions. When the stream fails, the product fails, even if the model did the right thing.

Durable Streams addresses this gap. It’s a minimal HTTP-based protocol for durable, offset-based streaming designed for client applications across all platforms: web browsers, mobile apps, native clients, IoT devices, and edge workers. The protocol is based on 1.5 years of production use at Electric for real-time Postgres sync, where it reliably delivers millions of state changes every day.

What You Get

Refresh-Safe

Users refresh the page, switch tabs, or background the app—they pick up exactly where they left off

Share Links

A stream is a URL. Multiple viewers can watch the same stream together in real time

Never Re-run

Don’t repeat expensive work because a client disconnected mid-stream

Multi-Device

Start on your phone, continue on your laptop, watch from a shared link—all in sync

Multi-Tab

Works seamlessly across browser tabs without duplicating connections or missing data

Massive Fan-out

CDN-friendly design means one origin can serve millions of concurrent viewers

Protocol Features

The protocol is:
  • Universal: Works anywhere HTTP works: web browsers, mobile apps, native clients, IoT devices, edge workers
  • Simple: Built on standard HTTP with no custom protocols
  • Resumable: Offset-based reads let you resume from any point
  • Real-time: Long-poll and SSE modes for live tailing with catch-up from any offset
  • Economical: HTTP-native design leverages CDN infrastructure for efficient scaling
  • Flexible: Content-type agnostic byte streams
  • Composable: Build higher-level abstractions on top (like Electric’s real-time Postgres sync engine)

Quick Example

Here’s a complete example showing how to create a stream, write data, and read it back:
import { DurableStream, IdempotentProducer } from "@durable-streams/client"

// Create a stream
const stream = await DurableStream.create({
  url: "https://your-server.com/v1/stream/my-stream",
  contentType: "application/json",
})

// Create an idempotent producer for reliable writes
const producer = new IdempotentProducer(stream, "my-service-1", {
  autoClaim: true,
  onError: (err) => console.error("Batch failed:", err),
})

// High-throughput writes - fire-and-forget
for (const event of events) {
  producer.append(JSON.stringify(event))
}

// Ensure all messages are delivered
await producer.flush()
await producer.close()

// Read with the streaming API
const res = await stream.stream<{ event: string; userId: string }>()
res.subscribeJson(async (batch) => {
  for (const item of batch.items) {
    console.log(item)
  }
})

Architecture Overview

Durable Streams sits between your application server and clients, providing durable log semantics over HTTP:
Application Server → Durable Streams → Clients (Web/Mobile/Native)
  (shapes data,        (server-to-client)
   authorizes)
Your application server:
  • Consumes from backend streaming systems (Kafka, RabbitMQ, etc.)
  • Applies authorization logic
  • Shapes data for specific clients
  • Fans out via Durable Streams
Durable Streams provides:
  • Persistent storage: Data survives disconnections and server restarts
  • Offset-based resumption: Resume from any position with well-defined semantics
  • Unified catch-up and live: Same API for historical replay and real-time tailing
  • Multi-reader support: Multiple clients can subscribe to the same stream
  • CDN-friendly: Offset-based URLs enable aggressive caching
  • Stateless servers: Clients track their own offsets
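
As a sketch of this fan-out pattern, assuming `backendEvents()` stands in for a Kafka or RabbitMQ consumer, `authorizeAndShape()` for your application’s authorization and shaping logic, and that the handle returned by DurableStream.create exposes the same `append` method used in the Message Framing section below:

import { DurableStream } from "@durable-streams/client"

// Create (or attach to) a per-audience stream to fan out through
const stream = await DurableStream.create({
  url: "https://your-server.com/v1/stream/org-123-updates",
  contentType: "application/json",
})

// backendEvents() and authorizeAndShape() are hypothetical stand-ins for
// your backend consumer and per-client authorization/shaping logic
for await (const event of backendEvents()) {
  const shaped = authorizeAndShape(event)
  if (shaped) {
    await stream.append(JSON.stringify(shaped))
  }
}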

Core Operations

The protocol defines simple HTTP-based operations:
1. Create a Stream

Use PUT to create a new stream with optional initial content:
curl -X PUT https://your-server.com/v1/stream/my-stream \
  -H "Content-Type: application/json"
2. Append Data

Use POST to append bytes to a stream:
curl -X POST https://your-server.com/v1/stream/my-stream \
  -H "Content-Type: application/json" \
  -d '{"event":"user.created","userId":"123"}'
3. Read from Offset

Use GET with an offset to read from any position:
curl "https://your-server.com/v1/stream/my-stream?offset=-1"
4. Live Tail

Add live=long-poll or live=sse for real-time updates:
curl "https://your-server.com/v1/stream/my-stream?offset=abc123&live=long-poll"

Key Concepts

Offsets

Offsets are opaque tokens that identify positions within a stream:
  • Opaque strings: Treat as black boxes; don’t parse or construct them
  • Lexicographically sortable: You can compare offsets to determine ordering
  • "-1" means start: Use offset: "-1" to read from the beginning
  • "now" means tail: Use offset: "now" to skip existing data and read only new data
  • Server-generated: Always use the offset value returned in responses
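
A sketch of offset handling using the client API from the Quick Example (`loadOffset`, `saveOffset`, and `render` are hypothetical application helpers, e.g. backed by localStorage):

import { DurableStream } from "@durable-streams/client"

const stream = await DurableStream.create({
  url: "https://your-server.com/v1/stream/my-stream",
  contentType: "application/json",
})

// Resume from the last persisted position, or from the beginning ("-1")
const offset = loadOffset() ?? "-1"

const res = await stream.stream({ offset, live: "auto" })
res.subscribe((chunk) => {
  render(chunk.data)        // hypothetical application handler
  saveOffset(chunk.offset)  // always persist the server-generated offset
})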

Message Framing

By default, Durable Streams is a raw byte stream with no message boundaries. You must implement your own framing:
// Write with newlines (NDJSON); "handle" is an open stream handle,
// e.g. the value returned by DurableStream.create in the Quick Example
await handle.append(JSON.stringify({ event: "user.created" }) + "\n")
await handle.append(JSON.stringify({ event: "user.updated" }) + "\n")

// Parse line by line
const res = await handle.stream({ live: false })
const text = await res.text()
const messages = text.split("\n").filter(Boolean).map(JSON.parse)
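
When tailing live, a chunk boundary can fall in the middle of a message, so line-based framing needs a buffer for the trailing partial line. A sketch, assuming `chunk.data` arrives as text and `handleMessage` is a hypothetical application handler:

let buffer = ""
const live = await handle.stream({ offset: "-1", live: "auto" })
live.subscribe((chunk) => {
  buffer += chunk.data
  const lines = buffer.split("\n")
  buffer = lines.pop() ?? ""                 // keep the trailing partial line
  for (const line of lines) {
    if (line) handleMessage(JSON.parse(line)) // hypothetical handler
  }
})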

Use Cases

AI Conversation Streaming

LLM inference is expensive. When a user’s tab gets suspended or they refresh the page, you don’t want to re-run the generation—you want them to pick up exactly where they left off.
import { DurableStream, IdempotentProducer } from "@durable-streams/client"

// Server: stream tokens to a durable stream
const stream = await DurableStream.create({
  url: `https://your-server.com/v1/stream/generation/${generationId}`,
  contentType: "text/plain",
})

const producer = new IdempotentProducer(stream, generationId, {
  autoClaim: true,
  lingerMs: 10,
})

for await (const token of llm.stream(prompt)) {
  producer.append(token)
}
await producer.flush()
await producer.close()

// Client: resume from last seen position (refresh-safe)
const res = await stream.stream({ offset: lastSeenOffset, live: "auto" })
res.subscribe((chunk) => {
  renderTokens(chunk.data)
  saveOffset(chunk.offset)
})

Database Sync

Stream database changes to web and mobile clients for real-time synchronization:
// Server: stream database changes (db.changes() as an async change feed)
for await (const change of db.changes()) {
  await handle.append(JSON.stringify(change))
}

// Client: receive and apply changes
const res = await handle.stream({ offset: lastSeenOffset, live: "auto" })
res.subscribeJson(async (batch) => {
  for (const change of batch.items) {
    applyChange(change)
  }
  saveOffset(batch.offset)
})

Event Sourcing

Build event-sourced systems with durable event logs:
// Append events
await handle.append({ type: "OrderCreated", orderId: "123" })
await handle.append({ type: "OrderPaid", orderId: "123" })

// Replay from beginning
const res = await handle.stream({ offset: "-1", live: false })
const events = await res.json()
const state = events.reduce(applyEvent, initialState)
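
For illustration, a hypothetical `applyEvent` reducer for the order events above:

type OrderEvent = { type: string; orderId: string }
type State = Record<string, { status: string }>

const initialState: State = {}

// Fold each event into a lookup of per-order states
function applyEvent(state: State, event: OrderEvent): State {
  switch (event.type) {
    case "OrderCreated":
      return { ...state, [event.orderId]: { status: "created" } }
    case "OrderPaid":
      return { ...state, [event.orderId]: { status: "paid" } }
    default:
      return state
  }
}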

Next Steps

Quick Start

Get up and running in 5 minutes with our quick start guide

Installation

Install client libraries for your language and set up the server

Protocol Specification

Read the complete protocol specification

GitHub Repository

View the source code and contribute