Durable Streams operates at the byte level, but different content types handle message boundaries differently. Understanding message framing is crucial for correctly reading and writing structured data.
Content types overview
When you create a stream, you specify a MIME content type that determines how data is framed:
// JSON mode - automatic message boundary preservation
const jsonStream = await DurableStream.create({
  url: 'https://streams.example.com/events',
  contentType: 'application/json'
})

// Byte mode - raw binary data
const byteStream = await DurableStream.create({
  url: 'https://streams.example.com/logs',
  contentType: 'text/plain'
})
JSON mode (application/json)
JSON mode provides automatic message boundary preservation. Each append operation stores messages as distinct units, and reads return data as JSON arrays.
Message boundaries
The server preserves message boundaries automatically. You append individual JSON values, and they’re stored and retrieved as separate messages.
import { DurableStream } from '@durable-streams/client'

const stream = await DurableStream.create({
  url: 'https://streams.example.com/events',
  contentType: 'application/json'
})

// Append individual messages (pre-serialized JSON strings)
await stream.append(JSON.stringify({ event: 'created', id: 1 }))
await stream.append(JSON.stringify({ event: 'updated', id: 2 }))

// Read returns array of messages
const response = await stream.stream()
const messages = await response.json()
console.log(messages)
// [{ event: 'created', id: 1 }, { event: 'updated', id: 2 }]
Array flattening for batching
When you POST a JSON array, the server flattens it exactly one level, treating each element as a separate message:
POST body: {"event": "created"}
Stores: 1 message → {"event": "created"}
POST body: [{"event": "a"}, {"event": "b"}]
Stores: 2 messages → {"event": "a"}, {"event": "b"}
POST body: [[1,2], [3,4]]
Stores: 2 messages → [1,2], [3,4]
POST body: [[[1,2,3]]]
Stores: 1 message → [[1,2,3]]
Client libraries may automatically wrap individual values in arrays for batching. For example, calling append({"x": 1}) might send [{"x": 1}] to the server, which flattens it to store one message: {"x": 1}.
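The one-level rule above can be sketched as a small helper. This is a hypothetical model of the server-side behavior, not the actual implementation, and `flattenOneLevel` is an illustrative name:

```typescript
// Hypothetical sketch of the server's one-level flattening rule:
// a top-level array contributes each element as one message;
// any other JSON value is stored as a single message.
function flattenOneLevel(body: string): string[] {
  const parsed = JSON.parse(body)
  if (Array.isArray(parsed)) {
    // Flatten exactly one level: nested arrays stay intact
    return parsed.map(element => JSON.stringify(element))
  }
  return [JSON.stringify(parsed)]
}
```

Under this model, `'[[[1,2,3]]]'` yields one stored message, `'[[1,2,3]]'`, matching the table above.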
Empty arrays
Servers reject a POST whose body is an empty JSON array ([]) with 400 Bad Request. An empty array is a no-op with no semantic meaning and likely indicates a client bug.
// ❌ Invalid: Empty array
await stream.append('[]') // Throws 400 Bad Request

// ✅ Valid: Create empty stream
const stream = await DurableStream.create({
  url: 'https://streams.example.com/events',
  contentType: 'application/json',
  body: '[]' // OK for PUT (creates empty stream)
})
Batching multiple messages
You can efficiently batch multiple messages in a single HTTP request:
// Client automatically batches for efficiency
await stream.append(JSON.stringify({ id: 1 }))
await stream.append(JSON.stringify({ id: 2 }))
await stream.append(JSON.stringify({ id: 3 }))
// May be sent as: [{"id":1},{"id":2},{"id":3}]
// Server stores 3 separate messages

// Or explicitly batch yourself:
const batch = [
  { id: 1, name: 'Alice' },
  { id: 2, name: 'Bob' },
  { id: 3, name: 'Charlie' }
]
await stream.append(JSON.stringify(batch))
// Server stores 3 separate messages
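The client-side batching described above can be sketched as a small queue that coalesces pending messages into a single array-bodied request. This is a hypothetical model, not the client library's actual mechanism; `createBatcher` and `send` are illustrative names. Skipping the flush when the queue is empty also avoids sending the empty array that servers reject:

```typescript
// Hypothetical sketch of client-side batching: queued messages are
// coalesced into one JSON array so the server's one-level flattening
// stores each element as a separate message.
function createBatcher(send: (body: string) => void) {
  const queue: string[] = []
  return {
    append(json: string) {
      queue.push(json)
    },
    flush() {
      if (queue.length === 0) return // never POST an empty array
      send('[' + queue.join(',') + ']')
      queue.length = 0
    }
  }
}
```

Three queued appends then produce a single request body like `[{"id":1},{"id":2},{"id":3}]`.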
JSON validation
Servers validate that appended data is valid JSON. Invalid JSON results in 400 Bad Request.
// ❌ Invalid JSON
await stream.append('not valid json') // 400 Bad Request

// ✅ Valid JSON
await stream.append(JSON.stringify({ valid: true })) // OK
Byte mode (all other content types)
For all non-JSON content types, the protocol operates at the raw byte level. The server does not interpret message boundaries; you are responsible for framing.
Raw concatenation
Multiple appends are simply concatenated as raw bytes. No delimiters or boundaries are added.
const stream = await DurableStream.create({
  url: 'https://streams.example.com/logs',
  contentType: 'text/plain'
})

// Three separate appends
await stream.append('Hello ')
await stream.append('world')
await stream.append('!')

// Read returns concatenated bytes
const response = await stream.stream()
const text = await response.text()
console.log(text) // "Hello world!"
Newline-delimited JSON (NDJSON)
For NDJSON streams, you manually add newlines:
const stream = await DurableStream.create({
  url: 'https://streams.example.com/events',
  contentType: 'application/x-ndjson'
})

// Manually add newlines for message boundaries
await stream.append(JSON.stringify({ event: 'a' }) + '\n')
await stream.append(JSON.stringify({ event: 'b' }) + '\n')

// Read and parse line-by-line
const response = await stream.stream()
for await (const chunk of response.textStream()) {
  const lines = chunk.split('\n').filter(line => line.trim())
  for (const line of lines) {
    const event = JSON.parse(line)
    console.log(event)
  }
}
Protocol Buffers
For binary formats like Protocol Buffers, include length prefixes:
import { DurableStream } from '@durable-streams/client'

const stream = await DurableStream.create({
  url: 'https://streams.example.com/events',
  contentType: 'application/x-protobuf'
})

// Encode message with a 4-byte big-endian length prefix
function encodeMessage(proto: Uint8Array): Uint8Array {
  const length = new Uint8Array(4)
  new DataView(length.buffer).setUint32(0, proto.length, false)
  const result = new Uint8Array(4 + proto.length)
  result.set(length, 0)
  result.set(proto, 4)
  return result
}

const message1 = MyProto.encode({ field: 'value1' }).finish()
const message2 = MyProto.encode({ field: 'value2' }).finish()

await stream.append(encodeMessage(message1))
await stream.append(encodeMessage(message2))
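To read the stream back, a matching decoder walks the concatenated bytes, reading each 4-byte big-endian length prefix and then that many payload bytes. This is a sketch under the framing shown above; `decodeMessages` is an illustrative helper:

```typescript
// Sketch of the matching decoder for 4-byte big-endian
// length-prefixed framing over concatenated stream bytes.
function decodeMessages(bytes: Uint8Array): Uint8Array[] {
  const messages: Uint8Array[] = []
  const view = new DataView(bytes.buffer, bytes.byteOffset, bytes.byteLength)
  let offset = 0
  while (offset + 4 <= bytes.length) {
    const length = view.getUint32(offset, false) // big-endian prefix
    offset += 4
    messages.push(bytes.subarray(offset, offset + length))
    offset += length
  }
  return messages
}
```

Each returned `Uint8Array` is one protobuf payload, ready to pass to your message type's decode function.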
Common content types
application/json
Use for: Structured events, API responses, state changes
Framing: Automatic message boundaries
Example: {"user": "alice", "action": "login"}

application/x-ndjson
Use for: Streaming logs, bulk data export
Framing: Manual newlines
Example: {"level":"info"}\n{"level":"error"}\n

text/plain
Use for: Log files, plain text data
Framing: Manual delimiters
Example: 2024-01-15 10:30:00 INFO Started\n

application/octet-stream
Use for: Binary data, custom formats
Framing: Length-prefixed or custom
Example: Binary protocol buffers with length headers
Choosing a content type
Use JSON mode for structured data
If you’re sending structured events, objects, or API responses, use application/json:
// ✅ Good: Automatic message boundaries
const stream = await DurableStream.create({
  url: 'https://streams.example.com/events',
  contentType: 'application/json'
})

await stream.append(JSON.stringify({ event: 'created' }))
await stream.append(JSON.stringify({ event: 'updated' }))
Use NDJSON for line-oriented data
For logs or data where newlines naturally separate records:
const stream = await DurableStream.create({
  url: 'https://streams.example.com/logs',
  contentType: 'application/x-ndjson'
})

await stream.append(JSON.stringify({ timestamp: Date.now() }) + '\n')
Use binary for custom protocols
For maximum efficiency or existing binary protocols:
const stream = await DurableStream.create({
  url: 'https://streams.example.com/telemetry',
  contentType: 'application/octet-stream'
})

// Your custom framing protocol
await stream.append(encodeCustomFormat(data))
Best practices
JSON mode is recommended for most use cases. It provides automatic message boundary preservation, making it easier to work with structured data without implementing custom framing.
Never mix content types on the same stream. Appends must match the stream’s configured content type, or the server returns 409 Conflict.
const stream = await DurableStream.create({
  url: 'https://streams.example.com/events',
  contentType: 'application/json'
})

// ✅ Good: Matches stream content type
await stream.append(JSON.stringify({ event: 'created' }))

// ❌ Wrong: Different content type
await stream.append('plain text') // 409 Conflict
Reading chunked data
Servers may return data in chunks for performance. The client libraries handle reassembly automatically:
const response = await stream.stream<{ event: string }>()

// Automatic chunk handling with backpressure
response.subscribeJson(async (batch) => {
  // batch.items contains complete messages,
  // even if they arrived across multiple HTTP chunks
  for (const item of batch.items) {
    console.log(item.event)
  }
})
Next steps
Live Modes: Learn about real-time streaming with long-polling and SSE
Idempotent Producers: Implement exactly-once write semantics
Caching and Fanout: Optimize performance with CDN caching