

Quick Start

This guide will help you get started with Durable Streams in just a few minutes. You’ll learn how to create a stream, write data to it, and read it back.
This quick start uses the TypeScript client. For other languages, see the Installation guide.

Prerequisites

You’ll need:
  • Node.js 18 or later
  • A Durable Streams server (we’ll start one locally)

Step 1: Start the Local Server

The easiest way to get started is to run the local development server:
1. Clone the repository:

git clone https://github.com/durable-streams/durable-streams.git
cd durable-streams

2. Install dependencies:

pnpm install

3. Build the packages:

pnpm build

4. Start the development server:

pnpm start:dev
The server will start on http://localhost:4437
For production deployments, download the Caddy-based server binary from GitHub releases. Available for macOS, Linux, and Windows.

Step 2: Install the Client Library

In a new terminal, create a new project and install the client library:
mkdir my-durable-streams-app
cd my-durable-streams-app
npm init -y
npm install @durable-streams/client

Step 3: Create Your First Stream

Create a file called index.js:
import { DurableStream } from "@durable-streams/client"

const stream = await DurableStream.create({
  url: "http://localhost:4437/v1/stream/my-first-stream",
  contentType: "application/json",
})

console.log("Stream created at:", stream.url)
Run it:
node index.js
You should see:
Stream created at: http://localhost:4437/v1/stream/my-first-stream
Congratulations! You’ve created your first durable stream.

Step 4: Write Data to the Stream

Now let’s write some data to the stream. Update index.js:
import { DurableStream } from "@durable-streams/client"

const stream = await DurableStream.create({
  url: "http://localhost:4437/v1/stream/my-first-stream",
  contentType: "application/json",
})

// Append individual messages
await stream.append(JSON.stringify({ event: "user.created", userId: "123" }))
await stream.append(JSON.stringify({ event: "user.updated", userId: "123" }))
await stream.append(JSON.stringify({ event: "user.deleted", userId: "123" }))

console.log("Data written to stream")
Use IdempotentProducer for production applications. It provides:
  • Exactly-once delivery semantics
  • Automatic batching for high throughput
  • Network resilience (safe to retry)
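
As a sketch, Step 4's writes could go through IdempotentProducer instead. The constructor, options, and methods below mirror the examples later in this guide; the producer id "quickstart-writer" is an arbitrary name chosen for this sketch:

```javascript
import { DurableStream, IdempotentProducer } from "@durable-streams/client"

const stream = await DurableStream.create({
  url: "http://localhost:4437/v1/stream/my-first-stream",
  contentType: "application/json",
})

// The producer id ("quickstart-writer") identifies this writer
// so the server can deduplicate retried appends
const producer = new IdempotentProducer(stream, "quickstart-writer", {
  lingerMs: 5, // collect appends for up to 5ms and send them as one batch
  onError: (err) => console.error("Failed to send:", err),
})

producer.append(JSON.stringify({ event: "user.created", userId: "123" }))
producer.append(JSON.stringify({ event: "user.updated", userId: "123" }))

// Flush any pending batch and close the producer
await producer.close()
```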

Step 5: Read Data from the Stream

Now let’s read the data back. Create a new file called read.js:
import { DurableStream } from "@durable-streams/client"

const stream = new DurableStream({
  url: "http://localhost:4437/v1/stream/my-first-stream",
})

// Read all data (catch-up mode)
const res = await stream.stream({ live: false })
const items = await res.json()

console.log("Items:", items)
// Output: [
//   { event: 'user.created', userId: '123' },
//   { event: 'user.updated', userId: '123' },
//   { event: 'user.deleted', userId: '123' }
// ]
Run it:
node read.js

Step 6: Try Live Streaming

To see the real power of Durable Streams, let’s try live streaming:
  1. Terminal 1: Start reading (this will wait for new data)
live-reader.js
import { DurableStream } from "@durable-streams/client"

const stream = new DurableStream({
  url: "http://localhost:4437/v1/stream/my-first-stream",
})

const res = await stream.stream({ live: "auto" })

res.subscribeJson(async (batch) => {
  for (const item of batch.items) {
    console.log("Received:", item)
  }
})

console.log("Waiting for new data...")
Run it:
node live-reader.js
  2. Terminal 2: Write new data
writer.js
import { DurableStream } from "@durable-streams/client"

const stream = new DurableStream({
  url: "http://localhost:4437/v1/stream/my-first-stream",
})

// Write messages every second
let count = 0
setInterval(async () => {
  count++
  await stream.append(JSON.stringify({ 
    event: "tick", 
    count,
    timestamp: new Date().toISOString() 
  }))
  console.log(`Sent message ${count}`)
}, 1000)
Run it:
node writer.js
You should see the messages appear in Terminal 1 in real time. You now have a working live streaming setup!

Using the CLI

For quick testing and exploration, you can use the CLI:
1. Install the CLI globally:

npm install -g @durable-streams/cli

Or run directly with npx:

npx @durable-streams/cli

2. Create a stream:

STREAM_URL=http://localhost:4437 \
durable-stream create test-stream

3. Write data:

# Write directly
durable-stream write test-stream "Hello, world!"

# Pipe from stdin
echo '{"message": "hello"}' | durable-stream write test-stream --json

4. Read data:

durable-stream read test-stream

Complete Example: AI Token Streaming

Here’s a real-world example showing how to stream LLM tokens with resume capability:
import { DurableStream, IdempotentProducer } from "@durable-streams/client"
import OpenAI from "openai"

const openai = new OpenAI()

// Create stream for this generation
const generationId = crypto.randomUUID()
const stream = await DurableStream.create({
  url: `http://localhost:4437/v1/stream/generation/${generationId}`,
  contentType: "text/plain",
})

// Use idempotent producer for reliable token delivery
const producer = new IdempotentProducer(stream, generationId, {
  autoClaim: true,
  lingerMs: 10, // Send batches every 10ms for low latency
  onError: (err) => console.error("Failed to send tokens:", err),
})

// Stream tokens from OpenAI
const completion = await openai.chat.completions.create({
  model: "gpt-4",
  messages: [{ role: "user", content: "Tell me a story" }],
  stream: true,
})

for await (const chunk of completion) {
  const token = chunk.choices[0]?.delta?.content
  if (token) {
    producer.append(token)
  }
}

// Close the stream when done
await producer.close()

console.log("Generation complete. Stream URL:", stream.url)
What this gives you:
  • Tab suspended? User comes back and catches up from saved offset
  • Page refresh? Continues from last token, not from the beginning
  • Share the generation? Multiple viewers watch the same stream in real-time
  • Switch devices? Start on mobile, continue on desktop

Testing with the Test UI

The repository includes a visual web interface for testing:
# Terminal 1: Start the server (if not already running)
pnpm start:dev

# Terminal 2: Start the Test UI
cd examples/test-ui
pnpm dev
Open http://localhost:3000 to:
  • Create and manage streams
  • Write messages with keyboard shortcuts
  • Monitor real-time stream updates
  • View the stream registry
  • Inspect stream metadata

Next Steps

  • Installation Guide: install client libraries for Python, Go, and other languages
  • Protocol Specification: learn about the complete protocol
  • API Reference: explore the full API documentation
  • Examples: see more example applications

Common Patterns

Pass authentication headers to the stream handle:
const stream = new DurableStream({
  url: "https://your-server.com/v1/stream/my-stream",
  headers: {
    Authorization: `Bearer ${token}`,
  },
})
Headers are sent with every request.
Use IdempotentProducer which batches automatically:
const producer = new IdempotentProducer(stream, "my-producer", {
  lingerMs: 5,          // Wait 5ms before sending
  maxBatchBytes: 65536, // Or send when batch reaches 64KB
})

// These get batched into fewer HTTP requests
for (const event of events) {
  producer.append(JSON.stringify(event))
}

await producer.flush()
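Conceptually, lingerMs-style batching collects appends for a short window and sends them together. This is a minimal, illustrative batcher, not the library's actual implementation:

```javascript
// Illustrative only: collects items for up to `lingerMs`, then
// hands the whole batch to `send` in one call.
function makeBatcher(send, { lingerMs = 5 } = {}) {
  let batch = []
  let timer = null

  return {
    append(item) {
      batch.push(item)
      if (!timer) {
        // First item in a new batch: start the linger window
        timer = setTimeout(() => {
          timer = null
          const toSend = batch
          batch = []
          send(toSend)
        }, lingerMs)
      }
    },
    flush() {
      // Send whatever is pending immediately
      if (timer) {
        clearTimeout(timer)
        timer = null
      }
      if (batch.length > 0) {
        const toSend = batch
        batch = []
        send(toSend)
      }
    },
  }
}
```

Many appends inside one window become a single send call, which is what keeps the HTTP request count low under bursty load.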
The client includes automatic retry with exponential backoff:
const stream = new DurableStream({
  url: "...",
  backoffOptions: {
    maxRetries: 5,
    initialDelay: 100,
    maxDelay: 10000,
  },
})
IdempotentProducer is safe to retry: the server deduplicates repeated appends.
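Assuming a standard doubling schedule (the exact curve is an implementation detail of the client), those backoffOptions would produce delays roughly like this:

```javascript
// Illustrative: delay in milliseconds before the nth retry,
// doubling from initialDelay and capping at maxDelay.
function backoffDelay(attempt, { initialDelay = 100, maxDelay = 10000 } = {}) {
  return Math.min(initialDelay * 2 ** attempt, maxDelay)
}

// attempts 0..5 with initialDelay 100 → 100, 200, 400, 800, 1600, 3200
```

Real clients often add random jitter on top of this curve so that many retrying clients don't hit the server in lockstep.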
The client also works in the browser. Here's a React example:
import { useEffect, useState } from "react"
import { DurableStream } from "@durable-streams/client"

function StreamComponent() {
  const [items, setItems] = useState([])

  useEffect(() => {
    const stream = new DurableStream({ url: "..." })

    const start = async () => {
      const res = await stream.stream({ live: "auto" })
      res.subscribeJson(async (batch) => {
        setItems(prev => [...prev, ...batch.items])
      })
    }

    start()
    // In real code, cancel the subscription in the effect's cleanup
    // so an unmounted component stops receiving batches
  }, [])

  return (
    <ul>
      {items.map((item, i) => <li key={i}>{JSON.stringify(item)}</li>)}
    </ul>
  )
}