
Quickstart

This 5-minute tutorial shows you how to set up an indexer from an EVM Squid SDK template. At the end you'll have a complete blockchain indexer that fetches, decodes, and serves Ethereum USDC transfer data.

What you’ll get

Your EVM indexer (squid) will:
  • Fetch all historical USDC transfers on Ethereum from the SQD Network
  • Decode the transfer events using contract ABIs
  • Save the data to a local PostgreSQL database
  • Start a GraphQL server with a rich API to query the historical USDC data
This tutorial focuses on EVM chains. For other chains, see the Solana Quickstart or Substrate Quickstart.

Prerequisites

Before you begin, ensure you have:
  • Node.js and npm
  • Git
  • Docker (used to run the local PostgreSQL database)

1. (Optional) Install Squid CLI

Install the Squid CLI globally:
npm i -g @subsquid/cli
Verify the installation by running sqd --version.
Squid CLI is a multi-purpose utility tool for scaffolding and managing indexers, both locally and in SQD Cloud.

2. Scaffold the indexer project

Create a new squid project from the USDC transfers example:
sqd init hello-squid -t https://github.com/subsquid-labs/showcase01-all-usdc-transfers
cd hello-squid
Or, if you skipped the Squid CLI installation:
git clone https://github.com/subsquid-labs/showcase01-all-usdc-transfers hello-squid
cd hello-squid

3. Inspect the project structure

Explore the project structure:
src/
├── abi
│   └── usdc.ts
├── main.ts
└── model
    ├── generated
    │   ├── index.ts
    │   ├── marshal.ts
    │   └── usdcTransfer.model.ts
    └── index.ts
Key files explained:
  • src/abi/usdc.ts - utility module generated from the USDC contract ABI for event decoding and RPC queries
  • src/model/ - TypeORM model classes auto-generated from schema.graphql for database operations
  • src/main.ts - the main executable containing the data retrieval configuration and processing logic
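For orientation, here is a sketch of the generated model class in src/model/generated/usdcTransfer.model.ts, inferred from the fields used in main.ts (the exact decorators depend on the codegen version):
usdcTransfer.model.ts
import {Entity as Entity_, Column as Column_, PrimaryColumn as PrimaryColumn_} from "typeorm"
import * as marshal from "./marshal"

@Entity_()
export class UsdcTransfer {
  constructor(props?: Partial<UsdcTransfer>) {
    Object.assign(this, props)
  }

  // Primary key; main.ts fills it with log.id
  @PrimaryColumn_()
  id!: string

  @Column_("int4", {nullable: false})
  block!: number

  @Column_("text", {nullable: false})
  from!: string

  @Column_("text", {nullable: false})
  to!: string

  // bigint values are stored as numeric and converted by the marshal helpers
  @Column_("numeric", {transformer: marshal.bigintTransformer, nullable: false})
  value!: bigint

  @Column_("text", {nullable: false})
  txnHash!: string
}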
The main.ts file first defines the EvmBatchProcessor object and configures its data retrieval options:
main.ts
import { EvmBatchProcessor } from '@subsquid/evm-processor'
import { TypeormDatabase } from '@subsquid/typeorm-store'

import * as usdcAbi from './abi/usdc'
import { UsdcTransfer } from './model'

// USDC contract address on Ethereum mainnet, in lowercase
// (the processor normalizes all addresses to lowercase)
const USDC_CONTRACT_ADDRESS = '0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48'

const processor = new EvmBatchProcessor()
  // Portal is the primary source of blockchain data in
  // squids, providing pre-filtered data in chunks of roughly 1-10k blocks.
  // Set this for a fast sync.
  .setPortal("https://portal.sqd.dev/datasets/ethereum-mainnet")
  // Another data source squid processors can use is chain RPC.
  // In this particular squid it is used to retrieve the very latest chain data
  // (including unfinalized blocks) in real time. It can also be used to
  //   - make direct RPC queries to get extra data during indexing
  //   - sync a squid without a portal (slow)
  .setRpcEndpoint('https://rpc.ankr.com/eth')
  // The processor needs to know how many newest blocks it should mark as "hot".
  // If it detects a blockchain fork, it will roll back any changes to the
  // database made due to orphaned blocks, then re-run the processing for the
  // main chain blocks.
  .setFinalityConfirmation(75)
  // .addXXX() methods request data items. In this case we're asking for
  // Transfer(address,address,uint256) event logs emitted by the USDC contract.
  //
  // We could have omitted the "address" filter to get Transfer events from
  // all contracts, or the "topic0" filter to get all events from the USDC
  // contract, or both to get all event logs chainwide. We also could have
  // requested some related data, such as the parent transaction or its traces.
  //
  // Other .addXXX() methods (.addTransaction(), .addTrace(), .addStateDiff()
  // on EVM) are similarly feature-rich.
  .addLog({
    range: { from: 6_082_465 },
    address: [USDC_CONTRACT_ADDRESS],
    topic0: [usdcAbi.events.Transfer.topic],
  })
  // .setFields() is for choosing data fields for the selected data items.
  // Here we're requesting the parent transaction hash for every event log.
  .setFields({
    log: {
      transactionHash: true,
    },
  })
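To illustrate the "related data" mentioned in the comments above: requesting the parent transaction of each matching log is a one-flag change. This is a sketch only, not used by this template:
// Sketch: also fetch the parent transaction of every matching Transfer log.
// It then becomes available as log.transaction inside the batch handler.
processor.addLog({
  address: [USDC_CONTRACT_ADDRESS],
  topic0: [usdcAbi.events.Transfer.topic],
  transaction: true,
})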
Next, main.ts defines the data processing and storage logic. Data processing is defined in the batch handler, the callback that processor.run() accepts as its second argument:
main.ts
// TypeormDatabase objects store the data to Postgres. They are capable of
// handling the rollbacks that occur due to blockchain forks.
//
// There are also Database classes for storing data to files and BigQuery
// datasets.
const db = new TypeormDatabase({ supportHotBlocks: true });

// The processor.run() call executes the data processing. Its second argument is
// the handler function that is executed once on each batch of data. Processor
// object provides the data via "ctx.blocks". However, the handler can contain
// arbitrary TypeScript code, so it's OK to bring in extra data from IPFS,
// direct RPC calls, external APIs etc.
processor.run(db, async (ctx) => {
  // Container for the objects that will become rows of the usdc_transfer
  // database table as the batch is processed. We'll insert them all at once
  // at the end, saving a lot of IO bandwidth.
  const transfers: UsdcTransfer[] = [];

  // The data retrieved from Portal and/or the RPC endpoint
  // is supplied via ctx.blocks
  for (let block of ctx.blocks) {
    // On EVM, each block has four iterables - logs, transactions, traces,
    // stateDiffs
    for (let log of block.logs) {
      if (
        log.address === USDC_CONTRACT_ADDRESS &&
        log.topics[0] === usdcAbi.events.Transfer.topic
      ) {
        // SQD's very own EVM codec at work - about 20 times faster than ethers
        let { from, to, value } = usdcAbi.events.Transfer.decode(log);
        transfers.push(
          new UsdcTransfer({
            id: log.id,
            block: block.header.height,
            from,
            to,
            value,
            txnHash: log.transactionHash,
          })
        );
      }
    }
  }

  // Just one insert per batch!
  await ctx.store.insert(transfers);
});

4. Install dependencies and build

Install dependencies and build the project:
npm i
npm run build
Verify the build completed successfully by checking for the lib/ directory.

5. Start the database and processor

The processor continuously fetches data, decodes it, and stores it in PostgreSQL. All of this logic is defined in main.ts and is fully customizable.
First, start a local PostgreSQL database (the template includes a Docker Compose file):
docker compose up -d
The processor connects to PostgreSQL using connection parameters from .env. Ensure the database is running before proceeding.
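Variable names are defined in the template's .env file; a typical SQD template uses values along these lines (illustrative only, check your copy):
.env
DB_NAME=squid
DB_USER=postgres
DB_PASS=postgres
DB_HOST=localhost
DB_PORT=23798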
Apply database migrations:
npx squid-typeorm-migration apply
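If you later edit schema.graphql, regenerate the model classes and create a fresh migration before applying it (not needed in this tutorial):
npx squid-typeorm-codegen
npx squid-typeorm-migration generate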
Then start the processor:
node -r dotenv/config lib/main.js
The indexer is now running and will begin processing blocks.

6. Start the GraphQL API

Start the GraphQL API to serve the transfer data:
npx squid-graphql-server

7. Query the data

You can now query your indexed data! Open the GraphiQL playground at http://localhost:4350/graphql.
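For example, the following query returns the ten most recent transfers (field names follow the UsdcTransfer model generated by the template):
query {
  usdcTransfers(limit: 10, orderBy: block_DESC) {
    id
    block
    from
    to
    value
    txnHash
  }
}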