
Durable Streams provides a production-proven solution for streaming database changes to client applications. Built on 1.5 years of production use at Electric for real-time Postgres sync, it reliably delivers millions of state changes every day.

Why durable streams for database sync

Traditional database sync approaches face several challenges:
  • Network unreliability - Clients disconnect constantly (backgrounded tabs, network switches, page refreshes)
  • Data loss - WebSocket or SSE connections lose in-flight data when connections drop
  • Complex resumption - Building offset tracking and resume logic for each application
  • Scale economics - Per-connection costs make dedicated connections to millions of clients prohibitive
Durable Streams solves these problems with offset-based resumability, HTTP compatibility, and CDN-friendly caching.

Core implementation pattern

1. Server: Stream database changes

Your backend captures database changes and appends them to a durable stream. Each client gets their own stream filtered and authorized for their access level.
import { DurableStream } from "@durable-streams/client"

// Create a stream for this client's authorized data
const stream = await DurableStream.create({
  url: `https://your-server.com/v1/stream/user/${userId}/data`,
  contentType: "application/json",
})

// Listen for database changes and append them to the stream
// (assumes db.changes() yields change events as an async iterable)
for await (const change of db.changes()) {
  await stream.append(change) // JSON objects are batched automatically
}

2. Client: Receive and apply changes

Clients read from their stream starting from their last known offset. The stream delivers all changes since that point, then transitions to live updates.
import { stream } from "@durable-streams/client"

// Resume from last seen position (stored locally)
const res = await stream({
  url: `https://your-server.com/v1/stream/user/${userId}/data`,
  offset: lastSeenOffset, // Or "-1" to start from beginning
  live: true, // Continue with live updates after catch-up
})

// Subscribe to changes as they arrive
res.subscribeJson(async (batch) => {
  for (const change of batch.items) {
    applyChange(change) // Update local state
  }
  // Save offset for resumption after disconnect
  saveOffset(batch.offset)
})

3. Handle reconnections gracefully

When clients reconnect (after a page refresh, network switch, etc.), they resume from their saved offset. No data is lost, and no complex state reconciliation is needed.
// After page refresh or reconnect
const res = await stream({
  url: `https://your-server.com/v1/stream/user/${userId}/data`,
  offset: getSavedOffset(), // Pick up exactly where we left off
  live: true,
})
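The `saveOffset` and `getSavedOffset` helpers used above are left to the application. A minimal sketch, assuming browser `localStorage` with an in-memory fallback for other runtimes (the storage key name is illustrative):

```typescript
// In-memory fallback for runtimes without localStorage
const memory = new Map<string, string>()

function saveOffset(offset: string, key = "durable-stream-offset"): void {
  if (typeof localStorage !== "undefined") localStorage.setItem(key, offset)
  else memory.set(key, offset)
}

function getSavedOffset(key = "durable-stream-offset"): string {
  const stored =
    typeof localStorage !== "undefined"
      ? localStorage.getItem(key)
      : memory.get(key)
  // "-1" asks the server for the stream from the beginning
  return stored ?? "-1"
}
```

In practice you would scope the key per stream URL so that different streams track their offsets independently.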

Complete example with State Protocol

The @durable-streams/state package provides a higher-level abstraction for building real-time synchronized applications using TanStack DB:
import { createStreamDB, createStateSchema } from "@durable-streams/state"
import { z } from "zod"

// Define your schema
const schema = createStateSchema({
  todos: {
    schema: z.object({
      id: z.string(),
      text: z.string(),
      completed: z.boolean(),
    }),
    type: "todo",
    primaryKey: "id",
  },
})

// Create stream DB
const db = createStreamDB({
  streamOptions: {
    url: `https://your-server.com/v1/stream/workspace/${workspaceId}`,
  },
  state: schema,
})

// Preload starts syncing in the background
await db.preload()

// Query the local state (reactive)
const todos = db.collections.todos.useAll()

// Subscribe to live updates
db.collections.todos.subscribe((todos) => {
  console.log("Todos updated:", todos)
})

Change event format

Durable Streams uses a standardized change event format compatible with the State Protocol:
{
  "headers": {
    "operation": "insert",
    "type": "todo"
  },
  "key": "todo-123",
  "value": {
    "id": "todo-123",
    "text": "Buy groceries",
    "completed": false
  }
}
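To make the format concrete in application code, here is a sketch of a TypeScript type for this event shape and a minimal `applyChange` that keeps rows in one map per event type. The store layout is illustrative, not part of the protocol, and `update` is treated as an upsert here:

```typescript
// Shape of the change event format shown above
type ChangeEvent = {
  headers: { operation: "insert" | "update" | "delete"; type: string }
  key: string
  value?: Record<string, unknown>
}

// Illustrative local state: one table (Map) per event type
const store = new Map<string, Map<string, Record<string, unknown>>>()

function applyChange(change: ChangeEvent): void {
  const { operation, type } = change.headers
  const table = store.get(type) ?? new Map<string, Record<string, unknown>>()
  store.set(type, table)
  if (operation === "delete") {
    table.delete(change.key) // remove the row by key
  } else {
    table.set(change.key, change.value ?? {}) // insert/update as upsert
  }
}
```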

Production features

Refresh-safe

Users can refresh the page, switch tabs, or background the app—they pick up exactly where they left off with no data loss.

Multi-device sync

Start on your phone, continue on your laptop—all devices stay in sync through the same durable stream.

Offline resilience

Clients track their offset locally. When they reconnect after being offline, they catch up on all missed changes.

CDN scaling

Offset-based URLs enable CDN caching and request collapsing. One origin server can serve millions of concurrent clients.

Authorization and filtering

Database streams should be filtered and authorized per-user on the server side before writing to the stream. Each user gets their own stream containing only the data they’re permitted to see.
// Server-side: Create filtered stream per user
const userStream = await DurableStream.create({
  url: `https://your-server.com/v1/stream/user/${userId}/data`,
  contentType: "application/json",
})

// Only append changes the user is authorized to see
// (assumes db.changes() yields change events as an async iterable)
for await (const change of db.changes()) {
  if (await isAuthorized(userId, change)) {
    await userStream.append(change)
  }
}
}
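The `isAuthorized` check is application-specific. A hypothetical row-ownership policy as a sketch — the `AuthChange` type and `ownerId` field are assumptions for illustration, not part of the API:

```typescript
// Hypothetical change shape carrying a row owner (illustrative only)
type AuthChange = { key: string; value: { ownerId?: string } }

// Allow a change through only when the row belongs to the requesting user.
// Real policies might consult roles, ACL tables, or row-level security.
async function isAuthorized(userId: string, change: AuthChange): Promise<boolean> {
  return change.value.ownerId === userId
}
```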

Live vs catch-up modes

Durable Streams supports two consumption modes. Live mode (live: true, used in the client examples above) continues delivering updates after the reader catches up. Catch-up mode reads historical data and stops once up-to-date, which is perfect for initial load or backfilling:
const res = await stream({
  url: streamUrl,
  offset: "-1", // Start from beginning
  live: false,  // Stop at current end
})

const allItems = await res.json()
console.log("Caught up with", allItems.length, "items")

Error handling and retry

The client automatically retries on network errors and 5xx responses with exponential backoff. For custom error handling (like refreshing auth tokens), use the onError callback.
// FetchError is assumed to be exported by the client package
import { stream, FetchError } from "@durable-streams/client"

const res = await stream({
  url: streamUrl,
  offset: lastSeenOffset,
  live: true,
  onError: async (error) => {
    if (error instanceof FetchError && error.status === 401) {
      // Refresh auth token and retry
      const newToken = await refreshAuthToken()
      return { headers: { Authorization: `Bearer ${newToken}` } }
    }
    // For other errors, propagate (don't return anything)
  },
})

Performance characteristics

From production use at Electric:
  • Sub-15ms latency - End-to-end delivery of database changes
  • Millions of concurrent clients - CDN caching enables massive scale without origin overload
  • Exactly-once delivery - Offset-based resumption guarantees no duplicates or gaps
  • Memory efficient - Streaming consumption means bounded memory usage regardless of data size

Next steps

State Protocol

Learn about the State Protocol for building synchronized applications

Client Libraries

Explore client libraries for TypeScript, Python, Go, and more

Protocol Reference

Deep dive into the Durable Streams protocol specification

Deployment Guide

Deploy Durable Streams to production with CDN configuration