
Quickstart

This 5-minute tutorial shows you how to set up an indexer from the Tron Squid SDK template. At the end you’ll have a complete blockchain indexer that fetches, decodes, and serves Tron USDT transfer data.

What you’ll get

Your Tron indexer (squid) will:
  • Fetch all historical USDT transfers from the SQD Network
  • Decode the transfer events
  • Save the data to a local PostgreSQL database
  • Start a GraphQL server with an API to query the historical USDT data
This tutorial focuses on Tron. For other chains, see the EVM Quickstart or Solana Quickstart.

Prerequisites

Before you begin, ensure you have:
  • Node.js and npm (used to install dependencies, build, and run the indexer)
  • Git (used to clone the template)
  • Docker (used to run the local PostgreSQL database)

Step 1: (Optional) Install Squid CLI

Install the Squid CLI globally:
npm i -g @subsquid/cli
Verify the installation by running sqd --version.
Squid CLI is a multi-purpose utility tool for scaffolding and managing indexers, both locally and in SQD Cloud.

Step 2: Scaffold the indexer project

Create a new squid project from the Tron example:
sqd init hello-squid -t https://github.com/subsquid-labs/tron-example
cd hello-squid
Or, if you skipped the Squid CLI installation:
git clone https://github.com/subsquid-labs/tron-example hello-squid
cd hello-squid

Step 3: Inspect the project structure

Explore the project structure:
src/
├── abi
│   └── erc20.ts
├── main.ts
└── model
    ├── generated
    │   ├── index.ts
    │   ├── marshal.ts
    │   └── transfer.model.ts
    └── index.ts
Key files explained:
  • src/abi - utility modules for interfacing with ERC20 data
  • src/model/ - TypeORM model classes used in database operations
  • src/main.ts - the main executable containing data retrieval settings and processing logic
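For orientation, the Transfer entity behind src/model/generated/transfer.model.ts has roughly the shape sketched below. This is a simplified, illustrative sketch: the actual file is produced by Squid's codegen and uses its own column decorators and marshalling helpers, but the field names and types match what the batch handler (shown further down) populates.
import {Entity, PrimaryColumn, Column} from 'typeorm'

@Entity()
export class Transfer {
  constructor(props?: Partial<Transfer>) {
    Object.assign(this, props)
  }

  @PrimaryColumn()
  id!: string // unique per log

  @Column('int4')
  blockNumber!: number

  @Column('timestamptz')
  timestamp!: Date

  @Column('text')
  tx!: string // hash of the transaction that emitted the event

  @Column('text')
  from!: string

  @Column('text')
  to!: string

  @Column('numeric') // the generated model adds a bigint transformer for this column
  amount!: bigint
}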
First, the main.ts file defines the TronBatchProcessor object and configures data retrieval:
main.ts
const processor = new TronBatchProcessor()
  // Provide Subsquid Network Gateway URL.
  .setGateway('https://v2.archive.subsquid.io/network/tron-mainnet')
  // Subsquid Network is always about N blocks behind the head.
  // We must use a regular HTTP API endpoint to get through the last mile
  // and stay on top of the chain.
  // This is a limitation, and we promise to lift it in the future!
  .setHttpApi({
    // the Ankr public endpoint is heavily rate-limited, so expect many 429 errors
    url: 'https://rpc.ankr.com/http/tron',
    strideConcurrency: 1,
    strideSize: 1,
  })
  // Block data returned by the data source has the following structure:
  //
  // interface Block {
  //   header: BlockHeader
  //   transactions: Transaction[]
  //   logs: Log[]
  //   internalTransactions: InternalTransaction[]
  // }
  //
  // For each block item we can specify a set of fields we want to fetch via the `.setFields()` method.
  // Think of it as an SQL projection.
  //
  // Accurate selection of only required fields can have a notable positive impact
  // on performance when data is sourced from Subsquid Network.
  //
  // We do it below only for illustration as all fields we've selected
  // are fetched by default.
  //
  // It is possible to override the default selection by setting undesired fields to `false`.
  .setFields({
    block: {
      timestamp: true,
    },
    transaction: {
      hash: true,
    },
    log: {
      address: true,
      data: true,
      topics: true
    }
  })
  // By default, a block can be skipped if it doesn't contain any explicitly requested items.
  //
  // We request items via `.addXxx()` methods.
  //
  // Each `.addXxx()` method accepts item selection criteria
  // and also allows requesting related items.
  //
  .addLog({
    // select logs
    where: {
      address: [USDT_ADDRESS],
      topic0: [TRANSFER_TOPIC]
    },
    // for each log selected above,
    // make the processor load related transactions
    include: {
      transaction: true
    }
  })
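The snippet above references USDT_ADDRESS and TRANSFER_TOPIC, which are defined near the top of main.ts. On Tron, log addresses and topics are plain hex strings without the 0x prefix (which is why the batch handler below adds it before decoding). Definitions along these lines are expected, but check the template for the exact values it ships with:
// USDT contract (TR7NHqjeKQxGTCi8q8ZY4pL8otSzgjLj6t) as a hex address, no 0x prefix
const USDT_ADDRESS = 'a614f803b6fd780986a42c78ec9c7f77e6ded13c'
// keccak256('Transfer(address,address,uint256)'), again without the 0x prefix
const TRANSFER_TOPIC = 'ddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef'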
Next, main.ts defines the data processing and storage logic. Data processing happens in the batch handler, the callback that the main processor.run() call receives as its final argument:
main.ts
processor.run(new TypeormDatabase(), async ctx => {
  let transfers: Transfer[] = []

  for (let block of ctx.blocks) {
    for (let log of block.logs) {
      if (log.address == USDT_ADDRESS && log.topics?.[0] === TRANSFER_TOPIC) {
        assert(log.data, 'USDT transfers always carry data')
        let tx = log.getTransaction()
        // `0x` prefixes make log data compatible with evm codec
        let event = {
          topics: log.topics.map(t => '0x' + t),
          data: '0x' + log.data
        }
        let {from, to, value} = erc20.events.Transfer.decode(event)

        transfers.push(new Transfer({
          id: log.id,
          blockNumber: block.header.height,
          timestamp: new Date(block.header.timestamp),
          tx: tx.hash,
          from,
          to,
          amount: value
        }))
      }
    }
  }

  await ctx.store.insert(transfers)
})

Step 4: Install dependencies and build

Install dependencies and build the project:
npm i
npm run build
Verify the build completed successfully by checking for the lib/ directory.

Step 5: Start the database and processor

The processor continuously fetches data, decodes it, and stores it in PostgreSQL. All logic is defined in main.ts and is fully customizable.

First, start a local PostgreSQL database (the template includes a Docker Compose file):
docker compose up -d
The processor connects to PostgreSQL using connection parameters from .env. Ensure the database is running before proceeding.
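The .env file shipped with the template holds these parameters. The exact contents come from the example repository; a minimal sketch of the usual variables (values here are illustrative) looks like this:
# database name and port the processor connects to
DB_NAME=squid
DB_PORT=23798
# port the GraphQL server (step 6) listens on
GQL_PORT=4350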
Apply database migrations:
npx squid-typeorm-migration apply
Then start the processor:
node -r dotenv/config lib/main.js
The indexer is now running and will begin processing blocks.

Step 6: Start the GraphQL API

Start the GraphQL API to serve the transfer data:
npx squid-graphql-server

Step 7: Query the data

You can now query your indexed data! Open the GraphiQL playground at localhost:4350/graphql.
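For example, a query along these lines returns the ten most recent transfers. The transfers query, its arguments, and the orderBy values are derived by the GraphQL server from the Transfer entity, so the field names below follow the handler code above:
query {
  transfers(limit: 10, orderBy: blockNumber_DESC) {
    blockNumber
    timestamp
    tx
    from
    to
    amount
  }
}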