Durable Streams is a proven solution for streaming database changes to client applications. Built on 1.5 years of production use at Electric for real-time Postgres sync, it reliably delivers millions of state changes every day.
1. Server: Capture and append changes

Your backend captures database changes and appends them to a durable stream. Each client gets their own stream, filtered and authorized for their access level.
```ts
import { DurableStream } from "@durable-streams/client"

// Create a stream for this client's authorized data
const stream = await DurableStream.create({
  url: `https://your-server.com/v1/stream/user/${userId}/data`,
  contentType: "application/json",
})

// Listen to database changes and append to stream
for (const change of db.changes()) {
  await stream.append(change) // JSON objects batched automatically
}
```
2. Client: Receive and apply changes
Clients read from their stream starting from their last known offset. The stream delivers all changes since that point, then transitions to live updates.
```ts
import { stream } from "@durable-streams/client"

// Resume from last seen position (stored locally)
const res = await stream({
  url: `https://your-server.com/v1/stream/user/${userId}/data`,
  offset: lastSeenOffset, // Or "-1" to start from beginning
  live: true, // Continue with live updates after catch-up
})

// Subscribe to changes as they arrive
res.subscribeJson(async (batch) => {
  for (const change of batch.items) {
    applyChange(change) // Update local state
  }
  // Save offset for resumption after disconnect
  saveOffset(batch.offset)
})
```
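The saveOffset and getSavedOffset helpers used here are application code, not part of the client library. A minimal sketch backed by localStorage, assuming offsets are strings as the "-1" sentinel suggests (the key name and storage choice are illustrative):

```ts
// Illustrative helpers, not part of @durable-streams/client.
// Persist the last processed offset so the client can resume after a reload.
const OFFSET_KEY = "stream-offset:user-data"

function saveOffset(offset: string): void {
  localStorage.setItem(OFFSET_KEY, offset)
}

function getSavedOffset(): string {
  // "-1" falls back to the start of the stream on first load.
  return localStorage.getItem(OFFSET_KEY) ?? "-1"
}
```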
3. Handle reconnections gracefully
When a client reconnects (after a page refresh, network switch, etc.), it resumes from its saved offset. No data is lost, and no complex state reconciliation is needed.
```ts
// After page refresh or reconnect
const res = await stream({
  url: `https://your-server.com/v1/stream/user/${userId}/data`,
  offset: getSavedOffset(), // Pick up exactly where we left off
  live: true,
})
```
Database changes should be filtered and authorized per user on the server before they are written to a stream. Each user gets their own stream containing only the data they’re permitted to see.
```ts
// Server-side: Create filtered stream per user
const userStream = await DurableStream.create({
  url: `https://your-server.com/v1/stream/user/${userId}/data`,
  contentType: "application/json",
})

// Only append changes the user is authorized to see
for (const change of db.changes()) {
  if (await isAuthorized(userId, change)) {
    await userStream.append(change)
  }
}
```
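The isAuthorized check is application-defined. A hypothetical example that scopes changes to rows the user owns; the change shape and ownerId field are assumptions about your change-capture format, not something the library defines:

```ts
// Hypothetical policy check; `change.row.ownerId` is an assumed field
// in your change-capture format.
async function isAuthorized(
  userId: string,
  change: { table: string; row: { ownerId?: string } }
): Promise<boolean> {
  // Example policy: users may only see rows they own.
  return change.row.ownerId === userId
}
```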
Read historical data and stop when up-to-date. Perfect for initial load or backfilling.
```ts
const res = await stream({
  url: streamUrl,
  offset: "-1", // Start from beginning
  live: false, // Stop at current end
})

const allItems = await res.json()
console.log("Caught up with", allItems.length, "items")
```
Catch up on historical data, then seamlessly transition to real-time updates.
```ts
const res = await stream({
  url: streamUrl,
  offset: lastSeenOffset,
  live: true, // Continue after catching up
})

res.subscribeJson(async (batch) => {
  // Receives both catch-up and live data
  for (const change of batch.items) {
    applyChange(change)
  }
  saveOffset(batch.offset)
})
```
The client automatically retries on network errors and 5xx responses with exponential backoff. For custom error handling (like refreshing auth tokens), use the onError callback.
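For example, a token-refresh sketch: the onError signature and return contract, the headers option, and the refreshAuthToken helper are all assumptions for illustration, not the documented API.

```ts
// Sketch only: onError's arguments and return value are assumed.
let token = await refreshAuthToken() // hypothetical helper

const res = await stream({
  url: streamUrl,
  offset: getSavedOffset(),
  live: true,
  headers: { Authorization: `Bearer ${token}` }, // assumed option
  onError: async (error) => {
    // Hypothetical: refresh the token on auth failures, then let the
    // client's built-in retry resume the stream.
    if (error.status === 401) {
      token = await refreshAuthToken()
      return { headers: { Authorization: `Bearer ${token}` } }
    }
    // Other errors fall through to automatic retry with backoff.
  },
})
```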