Cursors track the last successfully processed block, allowing pipelines to resume after restarts without reprocessing historical data.
Resume indexing from a specific block using cursors:
// createTarget comes from the pipeline SDK; imports are omitted here (see the full example linked below)
const target = createTarget({
  write: async ({logger, read}) => {
    // Resume from block 20_000_300 by passing a cursor to read()
    for await (const {data} of read({ number: 20_000_300 })) {
      console.log('Processing batch:', data)
    }
  }
})
Pass a cursor to read() to resume processing from a specific block. The cursor format is { number: blockNumber }. This allows you to restart your indexer without reprocessing all historical data.
Store the cursor after successfully processing each batch to enable resumption. Use a database or file-based storage for persistence across restarts.
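A minimal sketch of file-based cursor persistence, assuming each batch yielded by read() exposes the number of its last block (written here as number) and that read() treats an undefined cursor as a fresh start; the cursor file path and the loadCursor/saveCursor helpers are illustrative, not part of the SDK:

import { existsSync, readFileSync, writeFileSync } from 'node:fs'

const CURSOR_FILE = './cursor.json'

// Return the last stored cursor, or undefined on a fresh start
function loadCursor(): { number: number } | undefined {
  if (!existsSync(CURSOR_FILE)) return undefined
  return JSON.parse(readFileSync(CURSOR_FILE, 'utf8'))
}

// Persist the cursor once a batch has been fully processed
function saveCursor(cursor: { number: number }) {
  writeFileSync(CURSOR_FILE, JSON.stringify(cursor))
}

const target = createTarget({
  write: async ({read}) => {
    for await (const {data, number} of read(loadCursor())) {
      // ... process data for the batch ...
      saveCursor({ number }) // assumption: number is the batch's last block
    }
  }
})

On the next start, loadCursor() returns the stored position and read() resumes from there.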
Cursor updates are atomic: either the entire block batch succeeds and the cursor advances, or the cursor does not advance at all. Never modify the cursor position manually unless you can ensure you will not create gaps in the processed block range.
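When processed data and the cursor live in the same database, the simplest way to get this all-or-nothing behavior is to write both in one transaction. A minimal sketch using better-sqlite3 as an example store; the table names and row shape are assumptions for illustration:

import Database from 'better-sqlite3'

const db = new Database('indexer.db')
db.exec(`
  CREATE TABLE IF NOT EXISTS transfers (block INTEGER, payload TEXT);
  CREATE TABLE IF NOT EXISTS cursor (id INTEGER PRIMARY KEY CHECK (id = 1), number INTEGER);
`)

const insertRow = db.prepare('INSERT INTO transfers (block, payload) VALUES (?, ?)')
const setCursor = db.prepare(
  'INSERT INTO cursor (id, number) VALUES (1, ?) ' +
  'ON CONFLICT(id) DO UPDATE SET number = excluded.number'
)

// Either every row in the batch and the new cursor are committed together,
// or nothing is: a crash mid-batch leaves the old cursor in place and the
// batch is simply re-read on the next start.
const commitBatch = db.transaction((rows: { block: number, payload: string }[], number: number) => {
  for (const row of rows) insertRow.run(row.block, row.payload)
  setCursor.run(number)
})

Calling commitBatch(rows, number) from inside the write() loop keeps the stored data and the cursor consistent even if the process crashes mid-batch.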

View the full example on GitHub for complete working code with cursor management.