Documentation Index
Fetch the complete documentation index at: https://beta.docs.sqd.dev/llms.txt
Use this file to discover all available pages before exploring further.
Using with AI
The fastest way to get an AI coding agent productive on a Pipes SDK project is to install the official Pipes SDK Agent Skill. Two MCP servers are also available:
- Portal MCP server — 29 tools for querying blocks, transactions, logs, instructions, and analytics across 225+ datasets. No API key required.
- Documentation MCP server — search and retrieve these docs from inside the agent.
The llms.txt (index) and llms-full.txt (full content) files are kept in sync with the site. See the AI Development overview for the full menu.
Scaffolding with Pipes CLI
pipes-cli is a work in progress.

Prerequisites
- Node.js 22.15+
- pnpm
- Docker (for the bundled PostgreSQL container)
Initialize the project
Run the CLI in the directory where you want the project folder to land. It prompts for a package manager (please use pnpm for now), a sink (please use ClickHouse or Postgres), a network type, a network, and a template; it then installs dependencies and writes a runnable project.
You can supply a JSON config instead of filling in the prompts manually. Here is the configuration matching the choices mentioned above:
--config also accepts a path to a JSON file.
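As a rough illustration only, a config covering the same prompts might look like the fragment below. The key names and values here are assumptions, not the documented schema — inspect the real schema with the CLI before relying on any of them.

```json
{
  "packageManager": "pnpm",
  "sink": "postgres",
  "networkType": "evm",
  "network": "ethereum-mainnet",
  "template": "default"
}
```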
To inspect the full config schema, run:
Run the pipeline
The generated project ships with a docker-compose.yml that brings up the sink database and the pipeline together:
For an iterative dev loop, run the database in Docker and the pipeline locally:
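One way to set that loop up, assuming the compose file names the database service `db` and the project defines a `dev` script (both names are assumptions — check docker-compose.yml and package.json in the generated project):

```shell
# Start only the sink database container in the background
# (service name "db" is an assumption)
docker compose up -d db

# Run the pipeline locally against it
# (script name "dev" is an assumption)
pnpm dev
```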
What was generated
The project layout centers on src/index.ts. The decoder block defines what to extract, plus a light transform:
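To make the extract-then-transform idea concrete, here is a schematic, self-contained sketch. The types and function names below are invented for illustration and are not the Pipes SDK API — the real decoder in src/index.ts uses the SDK's own constructs.

```typescript
// A raw event as it might arrive from the source (shape is an assumption)
interface RawEvent {
  blockNumber: number;
  data: string;
}

// The row shape the sink table expects (also an assumption)
interface DecodedRow {
  block: number;
  payloadLength: number;
}

// "Decoder" sketch: declare what to extract (the filter), then apply a
// light transform (the map) to produce sink-ready rows.
function decode(events: RawEvent[]): DecodedRow[] {
  return events
    .filter((e) => e.data.length > 0) // extract: keep only non-empty events
    .map((e) => ({ block: e.blockNumber, payloadLength: e.data.length })); // transform
}

const rows = decode([
  { blockNumber: 100, data: "0xdeadbeef" },
  { blockNumber: 101, data: "" },
]);
console.log(rows); // the empty event is filtered out; one row remains
```

The point of keeping the transform light is that the decoder stays a pure function of its input batch, which makes it easy to test in isolation.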
The main() function wires the decoder to a drizzleTarget:
The id is a per-pipeline identifier — keep it stable so the target’s cursor survives restarts. See substratePortalSource for the full source API and Pipe anatomy for how the pieces fit together.
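A toy sketch of why the id must stay stable (this is not the SDK's cursor implementation — the Map-based store and function below are invented for the example): the target saves its progress cursor under the pipeline id, so renaming the pipeline makes the lookup miss and indexing starts over from the beginning.

```typescript
// Cursor storage keyed by pipeline id (a stand-in for the target's own store)
type CursorStore = Map<string, number>;

function resumeHeight(store: CursorStore, pipelineId: string): number {
  // Unknown id -> no saved cursor -> start from block 0
  return store.get(pipelineId) ?? 0;
}

const store: CursorStore = new Map();
store.set("my-pipeline", 4_200_000); // progress saved under the stable id

const sameId = resumeHeight(store, "my-pipeline"); // resumes where it left off
const renamed = resumeHeight(store, "my-pipeline-v2"); // cursor lost, starts over
console.log({ sameId, renamed });
```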
