
Quickstart

This 5-minute tutorial shows you how to set up an indexer from the Substrate Squid SDK template. At the end you’ll have a complete blockchain indexer that fetches, decodes, and serves KSM transfer data from Kusama.

What you’ll get

Your Substrate indexer (squid) will:
  • Fetch all historical KSM transfers on Kusama
  • Save the data to a local PostgreSQL database
  • Start a GraphQL server with a rich API to query the indexed transfers
This tutorial focuses on Substrate. For other chains, see the EVM Quickstart or Solana Quickstart.

Prerequisites

Before you begin, ensure you have Node.js (with npm), Git, and Docker installed; the steps below use all three.

1. (Optional) Install Squid CLI

Install the Squid CLI globally:
npm i -g @subsquid/cli
Verify the installation by running sqd --version.
Squid CLI is a multi-purpose utility tool for scaffolding and managing indexers, both locally and in SQD Cloud.

2. Scaffold the indexer project

Create a new squid project from the substrate template:
sqd init hello-squid -t substrate
cd hello-squid
Or, if you skipped installing Squid CLI:
git clone https://github.com/subsquid-labs/squid-substrate-template hello-squid
cd hello-squid

3. Inspect the project structure

Explore the project structure:
src/
├── main.ts
├── model
│   ├── generated
│   │   ├── account.model.ts
│   │   ├── index.ts
│   │   ├── marshal.ts
│   │   └── transfer.model.ts
│   └── index.ts
├── processor.ts
└── types
    ├── balances
    │   └── events.ts
    ├── events.ts
    ├── index.ts
    ├── support.ts
    ├── v1020.ts
    ├── v1050.ts
    └── v9130.ts
Key files explained:
  • src/types/ - utility modules generated for interfacing with the balances pallet data
  • src/model/ - TypeORM model classes used in database operations
  • src/processor.ts - data retrieval configuration
  • src/main.ts - the main executable containing the processing logic
The processor.ts file defines the SubstrateBatchProcessor object and configures data retrieval:
processor.ts
export const processor = new SubstrateBatchProcessor()
  // SQD gateway is the faster source of data, but it is optional
  // Not supplying it will cause all data to be ingested from RPC (slow)
  .setGateway('https://v2.archive.subsquid.io/network/kusama')
  // Chain RPC endpoint is required on Substrate for metadata and real-time updates
  .setRpcEndpoint({
    // Set via .env for local runs or via secrets when deploying to Subsquid Cloud
    // https://docs.subsquid.io/deploy-squid/env-variables/
    url: assertNotNull(process.env.RPC_KUSAMA_HTTP, 'No RPC endpoint supplied'),
    // More RPC connection options at https://docs.subsquid.io/substrate-indexing/setup/general/#set-data-source
    rateLimit: 10
  })
  .addEvent({
    name: [events.balances.transfer.name],
    extrinsic: true
  })
  .setFields({
    event: {
      args: true
    },
    extrinsic: {
      hash: true,
      fee: true
    },
    block: {
      timestamp: true
    }
  })
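The RPC endpoint is read from the RPC_KUSAMA_HTTP environment variable, which for local runs can be set in the project's .env file. A minimal sketch (the URL below is a placeholder, not a real endpoint; substitute a Kusama HTTP RPC endpoint you have access to):

```
# .env
RPC_KUSAMA_HTTP=https://your-kusama-rpc-endpoint.example
```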
Next, main.ts defines the data processing and storage logic. Processing happens in the batch handler, the callback that processor.run() receives as its final argument:
main.ts
processor.run(new TypeormDatabase({supportHotBlocks: true}), async (ctx) => {
  let transferEvents: TransferEvent[] = getTransferEvents(ctx)

  let accounts: Map<string, Account> = await createAccounts(ctx, transferEvents)
  let transfers: Transfer[] = createTransfers(transferEvents, accounts)

  await ctx.store.upsert([...accounts.values()])
  await ctx.store.insert(transfers)
})
See the full file for details.
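The helpers used above follow a common batch pattern: collect the raw transfer events, deduplicate every account they reference, then build entity rows that link to those accounts. A simplified, self-contained sketch of that pattern (the plain interfaces below stand in for the template's generated models and are assumptions, not the template's actual code):

```typescript
interface TransferEvent { id: string; from: string; to: string; amount: bigint }
interface Account { id: string }
interface Transfer { id: string; from: Account; to: Account; amount: bigint }

// Deduplicate every address seen in the batch into a single Account per id,
// mirroring what createAccounts does before the upsert.
function collectAccounts(events: TransferEvent[]): Map<string, Account> {
  const accounts = new Map<string, Account>()
  for (const e of events) {
    if (!accounts.has(e.from)) accounts.set(e.from, {id: e.from})
    if (!accounts.has(e.to)) accounts.set(e.to, {id: e.to})
  }
  return accounts
}

// Build Transfer entities referencing the deduplicated accounts,
// mirroring createTransfers.
function buildTransfers(events: TransferEvent[], accounts: Map<string, Account>): Transfer[] {
  return events.map(e => ({
    id: e.id,
    from: accounts.get(e.from)!,
    to: accounts.get(e.to)!,
    amount: e.amount,
  }))
}

const events: TransferEvent[] = [
  {id: '0001-0', from: 'alice', to: 'bob', amount: 5n},
  {id: '0002-0', from: 'bob', to: 'alice', amount: 3n},
]
const accounts = collectAccounts(events)   // 2 distinct accounts
const transfers = buildTransfers(events, accounts)
```

Upserting accounts before inserting transfers (as the real handler does) matters because the same account may already exist from an earlier batch.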

4. Install dependencies and build

Install dependencies and build the project:
npm i
npm run build
Verify the build completed successfully by checking for the lib/ directory.

5. Start the database and processor

The processor continuously fetches data, decodes it, and stores it in PostgreSQL. All of this logic is defined in main.ts and is fully customizable.
First, start a local PostgreSQL database (the template includes a Docker Compose file):
docker compose up -d
The processor connects to PostgreSQL using connection parameters from .env. Ensure the database is running before proceeding.
Apply database migrations:
npx squid-typeorm-migration apply
Then start the processor:
node -r dotenv/config lib/main.js
The indexer is now running and will begin processing blocks.

6. Start the GraphQL API

Start the GraphQL API to serve the transfer data:
npx squid-graphql-server

7. Query the data

You can now query your indexed data! Open the GraphiQL playground at localhost:4350/graphql.
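For example, a query for the ten most recent transfers might look like the one below. The field names (amount, timestamp, from, to) follow the transfer/account models generated by the template and are assumptions here; use the schema explorer in GraphiQL to confirm the exact shape.

```graphql
query RecentTransfers {
  transfers(limit: 10, orderBy: timestamp_DESC) {
    id
    amount
    timestamp
    from { id }
    to { id }
  }
}
```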
Last modified on December 17, 2025