
The source component connects to SQD Portal and streams blockchain data to your pipeline. It’s the starting point for all Pipes SDK data flows.

substratePortalSource

Create a Portal source for Substrate chains. Parameters:
  • id: (required) Pipeline ID. Must be unique within any infra shared with other pipelines (DB, logging sinks, etc.).
  • portal: (required) Portal API URL or config object.
    • String: "https://portal.sqd.dev/datasets/ethereum-mainnet"
    • Object: { url: string, finalized?: boolean }. When finalized: true is set, the stream consists of finalized blocks only and none of the fork-handling machinery is required.
  • outputs: (required) A single query/transformer chain, or a record of named outputs.
  • cache: (optional) Portal cache instance. If supplied, saves portal responses locally and reuses them when the pipeline re-runs.
  • logger: (optional) A pino-compatible Logger instance or a log level string. Accepted level values: 'fatal', 'error', 'warn', 'info', 'debug', 'trace', 'silent', false, null. Passing false or null silences all log output. When omitted, a default console logger is used.
  • metrics: (optional) metricsServer() instance for exposing Prometheus metrics.
  • progress: (optional) Options for progress tracking.
  • profiler: (optional) Enable the built-in per-batch profiler. See Profiling.
Example:
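A minimal sketch of constructing a source, assuming a hypothetical import path and reusing the ethereum-mainnet dataset URL shown above; the required outputs option depends on your query/transformer chain, so it is only indicated in a comment.

```typescript
// Hypothetical import path; check the Pipes SDK package docs for the real one.
// import { substratePortalSource } from '@sqd-pipes/core';

// The options object, shown as a plain value so its shape is clear.
const options = {
  id: 'my-pipeline',                                      // unique within shared infra
  portal: 'https://portal.sqd.dev/datasets/ethereum-mainnet',
  logger: 'info',                                         // pino-style level string
  // outputs: { transfers: myQueryAndTransformers },      // required in a real pipeline
};

// const source = substratePortalSource(options);
console.log(options.id);
```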

Finalized Blocks

You can configure the source to only receive finalized blocks:
const source = substratePortalSource({
  portal: {
    finalized: true,
    url: 'https://portal.sqd.dev/datasets/ethereum-mainnet'
  }
});
Using finalized blocks eliminates the need for rollback handlers in your targets, simplifying the logic of your pipeline.

Pipe methods

pipe()

Chain a single whole-pipe transformer to the source.
source.pipe(transformer)
The returned value behaves exactly like the source. See also: Stateful transformers.
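A minimal sketch of the transformation logic a whole-pipe transformer might apply. The Block shape and function here are illustrative only; the real SDK transformer type may differ.

```typescript
// Illustrative block shape; the real Substrate block type is richer.
type Block = { header: { number: number } };

// Pure batch-mapping logic: extract block numbers from a batch.
function blockNumbers(blocks: Block[]): number[] {
  return blocks.map((b) => b.header.number);
}

// In a pipeline this logic would live inside a transformer and be chained:
// const piped = source.pipe(myTransformer);
// `piped` behaves like the source, so further .pipe() calls are allowed.

console.log(blockNumbers([{ header: { number: 1 } }, { header: { number: 2 } }]));
```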

pipeTo()

Connect the pipeline to a target.
source.pipeTo(target)
This is a terminal operation: you cannot continue piping after calling this method. If you want your stream to resume on restarts and properly handle unfinalized data, make sure that the target manages cursors and handles forks correctly.
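A sketch of what "manages cursors and handles forks" can look like in a target; the interface and method names here are assumptions for illustration, not the SDK's actual Target contract.

```typescript
// Illustrative batch shape: data plus the cursor to resume from.
interface Batch { data: unknown[]; cursor: { number: number } }

class InMemoryTarget {
  cursor: { number: number } | null = null;
  rows: unknown[] = [];

  // Persist the batch and record the cursor; a real store would do this atomically.
  async write(batch: Batch): Promise<void> {
    this.rows.push(...batch.data);
    this.cursor = batch.cursor;
  }

  // On a fork, drop data above the new canonical head and rewind the cursor.
  async rollback(to: { number: number }): Promise<void> {
    this.cursor = to;
  }
}

// Wiring it up (terminal operation; nothing can be piped afterwards):
// await source.pipeTo(new InMemoryTarget());
```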

[Symbol.asyncIterator]()

Use the pipeline as an async iterator:
for await (const { data } of stream) {
  // ... do something with data ...
}
On blockchain forks this will throw a ForkException - see Fork handling.
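A sketch of consuming the pipeline with fork handling around the loop. The stream type and the exception name check are assumptions; consult Fork handling for the SDK's actual exception class.

```typescript
// Consume batches and return the total number of items processed.
async function consume(stream: AsyncIterable<{ data: unknown[] }>): Promise<number> {
  let total = 0;
  try {
    for await (const { data } of stream) {
      total += data.length; // process the batch here
    }
  } catch (err) {
    if ((err as Error).name === 'ForkException') {
      // Roll back your sink to the fork point, then restart the stream.
    }
    throw err;
  }
  return total;
}
```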