Durable Streams provides a natural foundation for event-sourced architectures, offering durable append-only logs with offset-based replay and exactly-once write semantics.
While backend event stores like Kafka excel at server-to-server event processing, they don’t extend cleanly to client applications. Durable Streams bridges this gap:
HTTP-native - Works in browsers, mobile apps, IoT devices—anywhere HTTP works
Offset-based replay - Resume from any point in the event stream
Read all events from the beginning to reconstruct current state. This is the core of event sourcing.
```ts
import { stream } from "@durable-streams/client"

// Replay from beginning
const res = await stream({
  url: `https://your-server.com/v1/stream/orders/${orderId}/events`,
  offset: "-1", // Start from beginning
  live: false, // Catch-up only
})

const events = await res.json()
const state = events.reduce(applyEvent, initialState)
console.log("Current order state:", state)
```
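The `applyEvent` reducer is left to the application. A minimal sketch for an order aggregate might look like the following; the event shapes (`OrderCreated`, `ItemAdded`, `OrderShipped`) and the `OrderState` type are illustrative assumptions, not part of the Durable Streams API:

```ts
// Illustrative event and state types (application-defined, not part of Durable Streams)
type OrderEvent =
  | { type: "OrderCreated"; orderId: string; timestamp: number }
  | { type: "ItemAdded"; sku: string; qty: number; timestamp: number }
  | { type: "OrderShipped"; timestamp: number }

interface OrderState {
  status: "new" | "open" | "shipped"
  items: Record<string, number> // sku → quantity
}

const initialState: OrderState = { status: "new", items: {} }

// Pure reducer: current state + one event → next state.
// Never mutates its input, so replay is deterministic.
function applyEvent(state: OrderState, event: OrderEvent): OrderState {
  switch (event.type) {
    case "OrderCreated":
      return { ...state, status: "open" }
    case "ItemAdded":
      return {
        ...state,
        items: {
          ...state.items,
          [event.sku]: (state.items[event.sku] ?? 0) + event.qty,
        },
      }
    case "OrderShipped":
      return { ...state, status: "shipped" }
  }
}
```

Because the reducer is pure, the same event log always folds to the same state, whether it is replayed from offset `-1` or applied live.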
Subscribe to new events
After catching up, subscribe to new events in real-time to keep state synchronized.
```ts
// Replay then live tail
const res = await stream({
  url: `https://your-server.com/v1/stream/orders/${orderId}/events`,
  offset: "-1",
  live: true, // Continue after catching up
})

let state = initialState
res.subscribeJson(async (batch) => {
  for (const event of batch.items) {
    state = applyEvent(state, event)
  }
  saveCheckpoint(batch.offset)
})
```
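Keeping the batch handler itself pure makes the projection easy to test in isolation. A sketch, assuming the batch shape `{ items, offset }` from the callback above; how the returned checkpoint is persisted stays application-defined:

```ts
// Apply one delivered batch to in-memory state and return the new
// state plus the offset to checkpoint. Pure: no I/O, no mutation.
function applyBatch<S, E>(
  state: S,
  batch: { items: E[]; offset: string },
  applyEvent: (s: S, e: E) => S
): { state: S; checkpoint: string } {
  const next = batch.items.reduce(applyEvent, state)
  return { state: next, checkpoint: batch.offset }
}
```

On restart, pass the last persisted checkpoint as the `offset` option (falling back to `"-1"` when none exists) so the subscriber resumes where it left off instead of replaying the whole log.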
Organize events by aggregate (e.g., one stream per order, user, or shopping cart):
```ts
// Each order has its own event stream
const orderStream = await DurableStream.create({
  url: `https://your-server.com/v1/stream/orders/${orderId}/events`,
  contentType: "application/json",
})

// Each user has their own event stream
const userStream = await DurableStream.create({
  url: `https://your-server.com/v1/stream/users/${userId}/events`,
  contentType: "application/json",
})

// Workspace-level events
const workspaceStream = await DurableStream.create({
  url: `https://your-server.com/v1/stream/workspaces/${workspaceId}/events`,
  contentType: "application/json",
})
```
This pattern provides:
Isolation - Each aggregate’s events are independent
Performance - Replay only relevant events for an aggregate
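One way to keep these per-aggregate URLs consistent across an application is a small helper. The base URL and path scheme follow the examples above; the helper itself is an illustrative assumption, not part of the client library:

```ts
const BASE_URL = "https://your-server.com/v1/stream"

// The aggregate kinds this application uses (extend as needed).
type AggregateKind = "orders" | "users" | "workspaces"

// Build the event-stream URL for one aggregate instance.
// encodeURIComponent guards against IDs containing reserved characters.
function eventStreamUrl(kind: AggregateKind, id: string): string {
  return `${BASE_URL}/${kind}/${encodeURIComponent(id)}/events`
}
```

Centralizing the path scheme this way means producers and consumers cannot drift apart on where an aggregate's events live.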
Because events are immutable and replayable, you can reconstruct state at any point in time for debugging.
```ts
// Debug: What was the order state at timestamp X?
async function getStateAtTime(
  orderId: string,
  targetTimestamp: number
): Promise<OrderState> {
  const res = await stream({
    url: `https://your-server.com/v1/stream/orders/${orderId}/events`,
    offset: "-1",
    live: false,
  })
  const events = await res.json()

  // Filter events up to target time
  const eventsUntilTime = events.filter(
    (e) => e.timestamp <= targetTimestamp
  )

  return eventsUntilTime.reduce(applyEvent, initialState)
}

// "What did this order look like yesterday at 3pm?"
const stateYesterday = await getStateAtTime(
  orderId,
  Date.now() - 24 * 60 * 60 * 1000
)
```
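The filter-and-reduce core of this query is pure, so it can be factored out and unit-tested without touching the stream at all. A sketch, assuming only that events carry a numeric `timestamp` field as in the examples above:

```ts
// Any event usable for point-in-time queries must carry a timestamp.
interface TimedEvent {
  timestamp: number
}

// Pure core of point-in-time reconstruction: fold only the events
// that happened at or before the target time.
function stateAtTime<S, E extends TimedEvent>(
  events: E[],
  targetTimestamp: number,
  applyEvent: (state: S, event: E) => S,
  initialState: S
): S {
  return events
    .filter((e) => e.timestamp <= targetTimestamp)
    .reduce(applyEvent, initialState)
}
```

Note that a linear scan over the full log is fine for debugging; for frequent point-in-time queries, periodic snapshots keyed by offset avoid replaying from the beginning every time.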