This page is a definitive end-to-end guide to practical squid development. It uses templates to simplify the process. Check out Squid from scratch for a more educational barebones approach.
Feel free to also use the template-specific sqd scripts defined in commands.json to simplify your workflow. See sqd CLI cheatsheet for a short intro.

Prepare the environment

  • Node v20.x or newer
  • Git
  • Squid CLI
  • Docker (if your squid will store its data in PostgreSQL)
See also the Environment set up page.

Understand your technical requirements

Consider your business requirements and find out:
  1. How the data should be delivered: served over a GraphQL API backed by PostgreSQL, or written out as file-based datasets.
  2. What Substrate-based chain you’re indexing - see supported networks. Note that you can use SQD via RPC ingestion even if your network is not listed.
  3. What exact data should be retrieved: events, extrinsics (calls), or storage items.
  4. Whether you need to mix in any off-chain data.

Start from a template {#templates}

Although it is possible to compose a squid from individual packages, in practice it is usually easier to start from a template.

Substrate template

A minimal template intended for developing Substrate squids:
sqd init my-squid-name -t substrate
After retrieving the template install its dependencies:
cd my-squid-name
npm i
Test the template locally:
1. Launch a PostgreSQL container

docker compose up -d
2. Build the squid

npm run build
3. Apply the DB migrations

npx squid-typeorm-migration apply
4. Start the squid processor

node -r dotenv/config lib/main.js
You should see output that contains lines like these:
04:11:24 INFO  sqd:processor processing blocks from 6000000
04:11:24 INFO  sqd:processor using archive data source
04:11:24 INFO  sqd:processor prometheus metrics are served at port 45829
04:11:27 INFO  sqd:processor 6051219 / 18079056, rate: 16781 blocks/sec, mapping: 770 blocks/sec, 544 items/sec, eta: 12m
5. Start the GraphQL server

Run the following command in a separate terminal:
npx squid-graphql-server
Then visit the GraphiQL console to verify that the GraphQL API is up.
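For instance, the server's built-in squidStatus field should report the highest indexed block:

```graphql
query {
  squidStatus {
    height
  }
}
```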
When done, shut down and erase your database with docker compose down.

The bottom-up development cycle {#bottom-up-development}

In this cycle you develop the squid incrementally, proceeding from type generation through processor configuration to data persistence. The advantage of this approach is that the code remains buildable at all times, making it easier to catch issues early.

I. Generate type-safe interfaces {#typegen}

For Substrate chains, generate TypeScript wrappers for events, calls, and storage:
npx squid-substrate-typegen typegen.json
The typegen.json file specifies which events, calls, and storage items to generate types for. See the Substrate typegen documentation for details. The generated classes will become available in src/types.
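For example, a typegen.json that generates wrappers for a single event might look like the following sketch (the metadata URL is a Kusama example; substitute the one for your network):

```json
{
  "outDir": "src/types",
  "specVersions": "https://v2.archive.subsquid.io/metadata/kusama",
  "pallets": {
    "Balances": {
      "events": ["Transfer"],
      "calls": [],
      "storage": []
    }
  }
}
```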

II. Configure the data requests {#processor-config}

Data requests are customarily defined at src/processor.ts. Edit the definition of const processor to:
  1. Set the data source(s): an SQD Network gateway URL and/or a chain RPC endpoint.
  2. Request the specific events, calls, and storage items your squid needs.
  3. Select all data fields necessary for your task.
See the reference documentation for more info.
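The steps above can be sketched as follows (the gateway and RPC URLs are Kusama examples; substitute your own network's):

```typescript
// src/processor.ts -- a minimal configuration sketch
import {SubstrateBatchProcessor} from '@subsquid/substrate-processor'

export const processor = new SubstrateBatchProcessor()
  // data sources: an SQD Network gateway and/or a chain RPC endpoint
  .setGateway('https://v2.archive.subsquid.io/network/kusama')
  .setRpcEndpoint('https://kusama-rpc.polkadot.io')
  // request specific events (here, with their parent extrinsics)
  .addEvent({name: ['Balances.Transfer'], extrinsic: true})
  // select the data fields needed for the task
  .setFields({event: {args: true}, block: {timestamp: true}})
```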

III. Decode and normalize the data {#batch-handler-decoding}

Next, change the batch handler to decode and normalize your data. In templates, the batch handler is defined at the processor.run() call in src/main.ts as an inline function. Its sole argument ctx contains:
  • at ctx.blocks: all the requested data for a batch of blocks
  • at ctx.store: the means to save the processed data
  • at ctx.log: a Logger
  • at ctx.isHead: a boolean indicating whether the batch is at the current chain head
  • at ctx._chain: the means to access RPC for state queries
Use the type-safe wrappers from the typegen step to decode events and calls. See Substrate batch context reference.
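A decoding sketch, assuming typegen was run for the Balances.Transfer event (the v9130 spec version name is a Kusama example; the generated version names depend on your chain's runtime history):

```typescript
// src/main.ts -- iterate the batch and decode a requested event
import {TypeormDatabase} from '@subsquid/typeorm-store'
import {processor} from './processor'
import {events} from './types'

processor.run(new TypeormDatabase(), async ctx => {
  for (let block of ctx.blocks) {
    for (let event of block.events) {
      if (events.balances.transfer.v9130.is(event)) {
        // decode with the type-safe wrapper for this runtime version
        let {from, to, amount} = events.balances.transfer.v9130.decode(event)
        ctx.log.info(`transfer of ${amount} from ${from} to ${to}`)
      }
    }
  }
})
```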

IV. Prepare the store {#store}

At src/main.ts, change the Database object definition to accept your output data.
1. Define the database schema

Define the schema of the database at schema.graphql.
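For example, a schema for tracking balance transfers could look like this (entity and field names are illustrative):

```graphql
type Transfer @entity {
  id: ID!
  from: String! @index
  to: String! @index
  amount: BigInt!
  timestamp: DateTime!
}
```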
2. Regenerate the TypeORM model classes

npx squid-typeorm-codegen
The classes will become available at src/model.
3. Compile the models code

npm run build
4. Ensure access to a blank database

The easiest way to do so is to start PostgreSQL in a Docker container with:
docker compose up -d
If the container is running, stop it and erase the database with:
docker compose down
before issuing a docker compose up -d.
5. Regenerate the migration

rm -r db/migrations
npx squid-typeorm-migration generate
You can now use the async functions ctx.store.upsert() and ctx.store.insert() to access the database.

V. Persist the transformed data {#batch-handler-persistence}

For each batch, create all the instances of all TypeORM model classes at once, then save them with the minimal number of calls to upsert() or insert(). See the Batch processing guide for patterns and anti-patterns.
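A persistence sketch, assuming a Transfer entity generated from schema.graphql (names are illustrative): accumulate entity instances for the whole batch, then write them out with a single call.

```typescript
// src/main.ts -- batch up writes instead of saving row by row
import {TypeormDatabase} from '@subsquid/typeorm-store'
import {processor} from './processor'
import {Transfer} from './model'

processor.run(new TypeormDatabase(), async ctx => {
  let transfers: Transfer[] = []
  for (let block of ctx.blocks) {
    for (let event of block.events) {
      // ...decode the event here, then collect the entity instance...
      transfers.push(new Transfer({
        id: event.id,
        // from, to, amount, timestamp: decoded values go here
      }))
    }
  }
  // one insert() for all rows of the batch, not one call per row
  await ctx.store.insert(transfers)
})
```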

Next steps