Offsets are the foundation of Durable Streams’ resumability. They allow you to pick up exactly where you left off after a disconnect, refresh, or app restart.
What is an offset?
An offset is an opaque token that identifies a specific position within a stream. Think of it like a bookmark—you can use it to resume reading from that exact point.

Offset properties
Offsets have several key properties:

Opaque: You must not interpret the offset structure or meaning. Always use offsets as-is, exactly as returned by the server.
// ✅ Correct: Use offset as-is
const offset = response.offset
await stream({ url, offset })
// ❌ Wrong: Don't parse or manipulate offsets
const parts = offset.split('_')
const modified = `${parts[0]}_${Number(parts[1]) + 1}`
Comparable: For any two valid offsets from the same stream, lexicographic comparison determines their relative position.
const offset1 = '100_1234'
const offset2 = '100_5678'
// You can compare offsets to determine order
if (offset1 < offset2) {
console.log('offset1 comes before offset2')
}
Persistent: You can save offsets to a database, localStorage, or any persistent storage and use them days or weeks later.
Unique: Each offset identifies exactly one position in the stream. No two positions share the same offset.
Special offset values
The protocol defines two special sentinel values.

-1 (Stream beginning)
The offset -1 represents the beginning of the stream. Use this to read from the start.
now (Current tail)
The offset now allows you to skip all existing data and begin reading from the current tail position. This is useful for applications that only care about future data.
The now offset is particularly useful for:
- Presence tracking (only care about future joins/leaves)
- Live monitoring (skip historical data)
- Late joiners to a conversation
- Real-time dashboards
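The two sentinels can be wrapped in a small helper that picks a starting offset per use case. This is a sketch; the mode names are illustrative, not part of the protocol:

```typescript
// Choose a starting offset from the two protocol sentinels.
// The mode names are illustrative, not part of the protocol.
function startOffset(mode: 'replay' | 'live-only'): string {
  // 'replay' rebuilds state from the beginning of the stream;
  // 'live-only' skips all existing data and reads from the current tail.
  return mode === 'replay' ? '-1' : 'now'
}

console.log(startOffset('replay'))    // '-1'
console.log(startOffset('live-only')) // 'now'
```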
Reading from an offset
When you read from a stream, you specify an offset using the offset query parameter. The response contains:
- Data: Bytes starting from that offset
- Stream-Next-Offset header: The offset for the next read
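A raw read might look like the sketch below. The endpoint URL is hypothetical, but the offset query parameter and the Stream-Next-Offset header are the ones described above:

```typescript
// Build the read URL with the offset as a query parameter.
function readUrl(base: string, offset: string): string {
  const u = new URL(base)
  u.searchParams.set('offset', offset)
  return u.toString()
}

// One raw read: returns the bytes from `offset` onward, plus the offset
// for the next read, taken verbatim from the response header.
async function readOnce(base: string, offset: string) {
  const res = await fetch(readUrl(base, offset))
  const nextOffset = res.headers.get('Stream-Next-Offset')
  const data = await res.arrayBuffer()
  return { data, nextOffset }
}

console.log(readUrl('https://example.com/streams/chat-42', '-1'))
// → 'https://example.com/streams/chat-42?offset=-1'
```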
Automatic offset tracking
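Internally, a library's tracking loop looks roughly like this sketch; the stub source stands in for real network reads (a real server may interpret its own offsets, but clients never do):

```typescript
type Read = { items: string[]; nextOffset: string }

// Stub source standing in for the server. Only the server may interpret
// the offsets it minted; clients treat them as opaque.
function makeStubSource(chunks: string[][]): (offset: string) => Read {
  return (offset) => {
    const i = offset === '-1' ? 0 : Number(offset)
    return { items: chunks[i] ?? [], nextOffset: String(i + 1) }
  }
}

// The library loop: always feed the returned offset into the next read.
function readAll(readOnce: (offset: string) => Read, reads: number): string[] {
  let offset = '-1'
  const out: string[] = []
  for (let n = 0; n < reads; n++) {
    const { items, nextOffset } = readOnce(offset)
    out.push(...items)
    offset = nextOffset // automatic offset tracking
  }
  return out
}

const source = makeStubSource([['a', 'b'], ['c']])
console.log(readAll(source, 2)) // [ 'a', 'b', 'c' ]
```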
Client libraries automatically track offsets for you during live streaming.

Resumability patterns
Pattern 1: Checkpoint on batch
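A sketch of this pattern, assuming a subscribeJson-style API that delivers { items, offset } batches; the batch and the offset store here are stubs:

```typescript
const savedOffsets = new Map<string, string>() // stand-in for a real database

async function processData(items: unknown[]): Promise<void> {
  // application logic goes here
}

async function handleBatch(streamId: string, batch: { items: unknown[]; offset: string }) {
  await processData(batch.items)
  // Checkpoint only after the batch is fully processed: a crash between the
  // two steps then replays the batch instead of silently skipping it.
  savedOffsets.set(streamId, batch.offset)
}

handleBatch('chat-42', { items: ['hello'], offset: '100_5678' })
  .then(() => console.log(savedOffsets.get('chat-42'))) // '100_5678'
```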
Save the offset after processing each batch.

Pattern 2: Resume from saved offset
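A sketch of resuming, with stream() stubbed so the example is self-contained; in a real app it is the client call from the earlier examples:

```typescript
// Stub standing in for the real client call.
async function stream(opts: { url: string; offset: string }) {
  return { startedAt: opts.offset }
}

const savedOffsets = new Map<string, string>() // stand-in for a real database

async function resume(streamId: string, url: string) {
  // Use the saved checkpoint; fall back to '-1' (the beginning) if none exists.
  const offset = savedOffsets.get(streamId) ?? '-1'
  return stream({ url, offset })
}

savedOffsets.set('chat-42', '100_5678')
resume('chat-42', 'https://example.com/streams/chat-42')
  .then((r) => console.log(r.startedAt)) // '100_5678'
```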
Restart from your last checkpoint.

Pattern 3: Multi-tab coordination
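A sketch of cross-tab sharing. In the browser this would use window.localStorage (and the 'storage' event to react to other tabs); an in-memory stand-in is used here so the sketch runs anywhere, and the key name is hypothetical:

```typescript
// In-memory stand-in for window.localStorage.
const store = new Map<string, string>()
const storage = {
  getItem: (k: string): string | null => store.get(k) ?? null,
  setItem: (k: string, v: string): void => { store.set(k, v) },
}

const OFFSET_KEY = 'durable-stream-offset:chat-42' // hypothetical key name

// Only advance the shared checkpoint. Offsets from the same stream compare
// lexicographically, so a plain string comparison is safe.
function publishOffset(offset: string): void {
  const prev = storage.getItem(OFFSET_KEY)
  if (prev === null || prev < offset) storage.setItem(OFFSET_KEY, offset)
}

function sharedOffset(): string {
  return storage.getItem(OFFSET_KEY) ?? '-1'
}

publishOffset('100_1234')
publishOffset('100_0009') // a slower tab reporting an older offset: ignored
console.log(sharedOffset()) // '100_1234'
```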
Share offsets across browser tabs using localStorage.

Offset format
While offsets are opaque, understanding the general format helps with debugging.

Common format: <read-seq>_<byte-offset>
Example: 100_5678
- 100: Read sequence number (chunks served)
- 5678: Byte offset in the stream
Server-side optimizations
The opaque nature of offsets enables important server-side optimizations. For example, an offset might encode:
- Which chunk file contains the data
- The position within that chunk file
- Metadata for efficient retrieval
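Purely as an illustration of the idea (not the actual server implementation), a server-side codec might look like this; the field names are hypothetical:

```typescript
// Illustrative only: a server might pack retrieval hints into the token it
// hands out. Only the server that minted an offset may decode it; clients
// never rely on any of this structure.
function encodeOffset(chunkFile: number, positionInChunk: number): string {
  return `${chunkFile}_${positionInChunk}`
}

function decodeOffset(offset: string): { chunkFile: number; positionInChunk: number } {
  const [chunkFile, positionInChunk] = offset.split('_').map(Number)
  return { chunkFile, positionInChunk }
}

console.log(decodeOffset(encodeOffset(3, 512))) // { chunkFile: 3, positionInChunk: 512 }
```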
Best practices
// ✅ Good: Save offsets regularly
response.subscribeJson(async (batch) => {
await processData(batch.items)
await db.saveOffset(streamId, batch.offset)
})
// ❌ Bad: No offset tracking
response.subscribeJson(async (batch) => {
await processData(batch.items)
// Lost: Can't resume after disconnect
})
// ✅ Good: Use returned offset
const res1 = await stream({ url, offset: '-1' })
const res2 = await stream({ url, offset: res1.offset })
// ❌ Bad: Try to calculate next offset
const res1 = await stream({ url, offset: '-1' })
const calculated = calculateNextOffset(res1.offset) // Don't do this!
Next steps
Message Framing
Learn how content types preserve message boundaries
Live Modes
Explore real-time streaming with long-polling and SSE
Caching and Fanout
Understand CDN caching for massive scale