
Quickstart

This 5-minute tutorial shows you how to set up a Starknet indexer from a Squid SDK template. At the end you’ll have a complete blockchain indexer that fetches, stores, and serves Starknet USDC transfers.

What you’ll get

Your Starknet indexer (squid) will:
  • Fetch all historical USDC transfers from the SQD Network
  • Save the data to a local PostgreSQL database
  • Start a GraphQL server with an API to query the historical USDC data
This tutorial focuses on Starknet. For other chains, see the EVM Quickstart or Solana Quickstart.

Prerequisites

Before you begin, ensure you have:
  • Node.js and npm
  • Git
  • Docker (used below to run the local PostgreSQL database)

1. (Optional) Install Squid CLI

Install the Squid CLI globally:
npm i -g @subsquid/cli
Verify the installation by running sqd --version.
Squid CLI is a multi-purpose utility tool for scaffolding and managing indexers, both locally and in SQD Cloud.
2. Scaffold the indexer project

Create a new squid project from the Starknet example:
sqd init hello-squid -t https://github.com/subsquid-labs/starknet-example
cd hello-squid
Or, if you skipped installing Squid CLI:
git clone https://github.com/subsquid-labs/starknet-example hello-squid
cd hello-squid
3. Inspect the project structure

Explore the project structure:
src/
├── main.ts
└── model
    ├── generated
    │   ├── index.ts
    │   ├── marshal.ts
    │   └── transfer.model.ts
    └── index.ts
Key files explained:
  • src/model/ - TypeORM model classes used in database operations
  • main.ts - Main executable containing data retrieval settings and processing logic
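Before diving into main.ts, it helps to know the shape of the Transfer entity. The sketch below shows what src/model/generated/transfer.model.ts typically looks like; the actual file is generated by @subsquid/typeorm-codegen from the project schema, so the exact decorators and nullability may differ:
transfer.model.ts (sketch)
import {Entity as Entity_, Column as Column_, PrimaryColumn as PrimaryColumn_} from 'typeorm'
import * as marshal from './marshal'

@Entity_()
export class Transfer {
  constructor(props?: Partial<Transfer>) {
    Object.assign(this, props)
  }

  // Primary key; the batch handler builds it from the block height and event position
  @PrimaryColumn_()
  id!: string

  @Column_('int4', {nullable: false})
  block!: number

  @Column_('text', {nullable: true})
  from!: string | undefined | null

  @Column_('text', {nullable: true})
  to!: string | undefined | null

  // Stored as numeric; converted to/from bigint by a transformer from ./marshal
  @Column_('numeric', {transformer: marshal.bigintTransformer, nullable: true})
  value!: bigint | undefined | null
}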
First, the main.ts file defines the data source object and configures data retrieval:
main.ts
import {DataSourceBuilder, StarknetRpcClient} from '@subsquid/starknet-stream'

// First we create a DataSource - a component
// that defines where to get the data and what data we should get.
const dataSource = new DataSourceBuilder()
  // Provide Subsquid Network Gateway URL.
  .setGateway('https://v2.archive.subsquid.io/network/starknet-mainnet')
  // Subsquid Network is always a few thousand blocks behind the head.
  // We must use regular RPC endpoint to get through the last mile
  // and stay on top of the chain.
  // This is a limitation, and we promise to lift it in the future!
  .setRpc(process.env.STARKNET_NODE == null ? undefined : {
    client: new StarknetRpcClient({
      url: process.env.STARKNET_NODE,
      // rateLimit: 100 // requests per sec
    }),
    strideConcurrency: 10
  })
  // Starting from the block where the USDC contract was deployed
  .setBlockRange({from: 3135})
  //
  // Block data returned by the data source has the following structure:
  //
  // interface Block {
  //   header: BlockHeader
  //   transactions: Transaction[]
  //   events: Event[]
  // }
  //
  // For each block item we can specify a set of fields we want to fetch via the `.setFields()` method.
  // Think of it as an SQL projection.
  //
  // Accurate selection of only required fields can have a notable positive impact
  // on performance when data is sourced from Subsquid Network.
  //
  // It is possible to override the default selection by setting undesired fields to `false`.
  .setFields({
    block: { // block header fields
      timestamp: true
    },
    transaction: { // transaction fields
      transactionHash: true
    },
    event: {
      fromAddress: true,
      data: true
    }
  })
  // By default, a block can be skipped if it doesn't contain explicitly requested items.
  //
  // We request items via `.addXxx()` methods.
  //
  // Each `.addXxx()` method accepts item selection criteria
  // and also allows requesting related items.
  //
  .addEvent({
    fromAddress: [USDC_ADDRESS],
    transaction: true // fetch transactions that gave rise to the selected events
  })
  .build()
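The USDC_ADDRESS constant referenced above is defined elsewhere in main.ts. As an illustration only (verify it against the template source and a block explorer before relying on it), the Starknet mainnet USDC contract address is commonly given as:
// Hypothetical illustration - confirm this address against the template yourself
const USDC_ADDRESS = '0x053c91253bc9682c04929ca02ed00b3e423f6710d2ee7e0d5ebb06f3ecf368a8'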
Next, main.ts defines the data processing and storage logic. Data processing is defined in the batch handler, the callback that the run() function receives as its final argument:
main.ts
import {run} from '@subsquid/batch-processor'
import {TypeormDatabase} from '@subsquid/typeorm-store'
import {Transfer} from './model'

// Below we create a `TypeormDatabase`.
//
// It provides a restricted subset of the [TypeORM EntityManager API](https://typeorm.io/working-with-entity-manager)
// as a persistent storage interface and works with any Postgres-compatible database.
//
// Note that we don't pass any database connection parameters.
// That's because `TypeormDatabase` expects a certain project structure
// and environment variables to pick everything it needs by convention.
// The companion `@subsquid/typeorm-migration` tool works in the same way.
//
// For full configuration details please consult
// https://github.com/subsquid/squid-sdk/blob/278195bd5a5ed0a9e24bfb99ee7bbb86ff94ccb3/typeorm/typeorm-config/src/config.ts#L21
const database = new TypeormDatabase()

// Now we are ready to start data processing
run(dataSource, database, async ctx => {
  let transfers: Transfer[] = []

  for (let block of ctx.blocks) {
    for (let event of block.events) {
      transfers.push(new Transfer({
        id: `${block.header.height}-${event.transactionIndex}-${event.eventIndex}`,
        block: block.header.height,
        from: event.data[0],
        to: event.data[1],
        value: event.data[2] ? BigInt(event.data[2]) : undefined,
      }))
    }
  }

  await ctx.store.insert(transfers)
})
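A note on writes: ctx.store follows TypeORM EntityManager semantics, and insert fails on duplicate ids. If you expect to re-run the handler over the same range (e.g. while iterating on the batch logic without wiping the database), an idempotent write is a common alternative; this sketch assumes the upsert method available in recent @subsquid/typeorm-store versions:
// Idempotent alternative to insert: rows with matching ids are overwritten
await ctx.store.upsert(transfers)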
4. Install dependencies and build

Install dependencies and build the project:
npm i
npm run build
Verify the build completed successfully by checking for the lib/ directory.
5. Start the database and processor

The processor continuously fetches data, decodes it, and stores it in PostgreSQL. All of this logic is defined in main.ts and is fully customizable.
First, start a local PostgreSQL database (the template includes a Docker Compose file):
docker compose up -d
The processor connects to PostgreSQL using connection parameters from .env. Ensure the database is running before proceeding.
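For reference, the .env file in these templates typically looks like the sketch below. The exact variable names and defaults in your checkout may differ, so treat this as an assumption and check the file itself:
.env (sketch)
DB_NAME=squid
DB_PASS=postgres
DB_PORT=23798
GQL_PORT=4350
# Optional: Starknet RPC endpoint for real-time data (see setRpc() above)
# STARKNET_NODE=https://your-starknet-rpc-url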
Apply database migrations:
npx squid-typeorm-migration apply
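If you later change the schema and models, regenerate the migration files with the companion generate command before applying them again:
npx squid-typeorm-migration generate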
Then start the processor:
node -r dotenv/config lib/main.js
The indexer is now running and will begin processing blocks.
6. Start the GraphQL API

Start the GraphQL API to serve the transfer data:
npx squid-graphql-server
7. Query the data

You can now query your indexed data! Check it out in the GraphiQL playground at http://localhost:4350/graphql.
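For example, a query along these lines returns the five most recent transfers (field and argument names follow the Transfer model and standard @subsquid/graphql-server conventions; adjust if your schema differs):
query {
  transfers(limit: 5, orderBy: block_DESC) {
    id
    block
    from
    to
    value
  }
}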