Build a custom data sink

A target drains batches from the pipe and is responsible for persisting them.
write: (required) Async function `({ read, logger }) => Promise<void>`. Iterate the stream by calling `read()` and consuming `{ data, ctx }` batches. The function returns when the stream ends.

fork: (optional) `(previousBlocks: BlockCursor[]) => Promise<BlockCursor | null>`. Called when the source detects a chain reorg. Return the last safe cursor to roll back to, or `null` if no common ancestor can be determined (the stream will throw). See Fork handling. You don't need this callback when the source is configured to read only finalized blocks.
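Put together, a minimal target might look like the sketch below. The type shapes and the in-memory store are stand-ins for illustration only; the real `BlockCursor`, logger, and batch types come from the library:

```typescript
// Local stand-ins for the pipeline types; these shapes are assumptions
// for this sketch, not the library's actual exports.
type BlockCursor = { number: number; hash: string };
type BatchContext = { logger: { info: (msg: string) => void } };
type Batch = { data: unknown[]; ctx: BatchContext };

// In-memory "store" standing in for a real database.
const persisted: unknown[] = [];
let savedCursor: BlockCursor | null = null;

const target = {
  // Required: drain the stream until it ends.
  async write({ read }: { read: (cursor?: BlockCursor) => AsyncIterable<Batch> }): Promise<void> {
    for await (const { data, ctx } of read(savedCursor ?? undefined)) {
      persisted.push(...data); // persist the batch, then advance savedCursor
      ctx.logger.info(`stored ${data.length} rows`);
    }
  },

  // Optional: on a reorg, return the newest cursor this target has safely
  // persisted, or null if there is no common ancestor (the stream then throws).
  async fork(previousBlocks: BlockCursor[]): Promise<BlockCursor | null> {
    for (const cursor of previousBlocks) {
      if (savedCursor !== null && cursor.number <= savedCursor.number) return cursor;
    }
    return null;
  },
};

// Drive write() with a fake read() to show the control flow.
async function* fakeBatches(): AsyncGenerator<Batch> {
  const ctx: BatchContext = { logger: { info: () => {} } };
  yield { data: ['a', 'b'], ctx };
  yield { data: ['c'], ctx };
}
await target.write({ read: () => fakeBatches() });
```

`write` owns the entire loop, so the target decides its own batching, retry, and commit strategy; the stream only resumes from whatever cursor you hand back.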
The write context
| Field | Description |
|---|---|
read | Opens an async iterator over pipeline batches. Pass a cursor to resume from a specific point the target has persisted. |
logger | Pino-compatible logger scoped to this target. |
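A typical use of these two fields is to load the last persisted position before opening the iterator. In this hedged sketch, `loadCursor()` and the `store` object are hypothetical helpers standing in for your own persistence layer:

```typescript
type BlockCursor = { number: number; hash: string };
type Batch = { data: unknown[]; ctx: unknown };

// Hypothetical persistence helper: returns the last cursor this target saved.
const store: { cursor: BlockCursor | null } = { cursor: { number: 100, hash: '0xabc' } };
async function loadCursor(): Promise<BlockCursor | null> {
  return store.cursor;
}

// Records which cursor read() was called with, for the demo below.
const requestedFrom: (BlockCursor | undefined)[] = [];

async function write({ read, logger }: {
  read: (cursor?: BlockCursor) => AsyncIterable<Batch>;
  logger: { info: (msg: string) => void };
}): Promise<void> {
  const cursor = await loadCursor(); // last position this target persisted
  logger.info(`resuming after block ${cursor?.number ?? 'genesis'}`);
  for await (const _batch of read(cursor ?? undefined)) {
    // persist _batch and advance store.cursor here
  }
}

// Exercise with a fake read() that yields nothing but records its argument.
await write({
  read: (cursor) => {
    requestedFrom.push(cursor);
    return (async function* (): AsyncGenerator<Batch> {})();
  },
  logger: { info: () => {} },
});
```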
Per-batch context (ctx)
Each { data, ctx } yielded by read() carries the same BatchContext that transformers receive. Fields:
| Field | Type | Description |
|---|---|---|
id | string | Pipeline ID — the id passed to substratePortalStream(). |
logger | Logger | Batch-scoped logger. |
metrics | Metrics | Prometheus metrics registry. See Metrics. |
profiler | Profiler | Open a span with ctx.profiler.start('label'). See Profiling. |
stream.dataset | ApiDataset | Dataset metadata. |
stream.head.finalized | BlockCursor | undefined | Current finalized head. |
stream.head.latest | BlockCursor | undefined | Current unfinalized head. |
stream.state.initial | number | First block the stream was configured to read. |
stream.state.last | number | Last block the stream intends to read. |
stream.state.current | BlockCursor | Latest block cursor in this batch. |
stream.state.rollbackChain | BlockCursor[] | Tail of unfinalized cursors subject to rollback. |
stream.progress | ProgressEvent['progress'] | Progress metrics when progress is enabled. |
stream.query | { url, hash, raw } | Portal query details for the batch. |
batch.blocksCount | number | Number of blocks in this batch. |
batch.bytesSize | number | Compressed payload size received from the portal. |
batch.requests | Record<number, number> | Map of HTTP status code → count of responses that produced this batch. |
batch.lastBlockReceivedAt | Date | Wall-clock time the last block was received. |
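Inside the write loop these fields can drive logging and timing. The `Ctx` shape below is a local stand-in mirroring the table above, and the `end()` method on the profiler span is an assumption for this sketch:

```typescript
// Local stand-in types mirroring the per-batch context table (assumed shapes).
type BlockCursor = { number: number; hash: string };
type Ctx = {
  id: string;
  logger: { info: (msg: string) => void };
  profiler: { start: (label: string) => { end: () => void } }; // end() assumed
  stream: { state: { current: BlockCursor } };
  batch: { blocksCount: number; bytesSize: number };
};

const summaries: string[] = [];

function handleBatch(data: unknown[], ctx: Ctx): void {
  const span = ctx.profiler.start('persist'); // time the persistence step
  // ...write `data` to your store here...
  span.end();
  summaries.push(
    `${ctx.id}: ${ctx.batch.blocksCount} blocks up to #${ctx.stream.state.current.number}`,
  );
}

// Exercise with a stubbed context.
handleBatch([], {
  id: 'my-pipe',
  logger: { info: () => {} },
  profiler: { start: () => ({ end: () => {} }) },
  stream: { state: { current: { number: 42, hash: '0x1' } } },
  batch: { blocksCount: 7, bytesSize: 1024 },
});
```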
