Quickstart

This 5-minute tutorial shows you how to scaffold a Solana indexer from a Squid SDK template. By the end you'll have a complete blockchain indexer that fetches, decodes, and serves data on Whirlpool swaps in the USDC-SOL pair.

What you’ll get

Your Solana indexer (squid) will:
  • Fetch all historical USDC-SOL swaps made on Whirlpool
  • Save the data to a local PostgreSQL database
  • Start a GraphQL server with a rich API to query the indexed swaps
This tutorial focuses on Solana. For other chains, see the EVM Quickstart or Substrate Quickstart.

Prerequisites

Before you begin, ensure you have:
  • Node.js (npm is used to install dependencies and build the project)
  • Docker (used to run the local PostgreSQL database)
  • Git (used to clone the template if you skip Squid CLI)

Step 1: (Optional) Install Squid CLI

Install the Squid CLI globally:
npm i -g @subsquid/cli
Verify the installation by running sqd --version
Squid CLI is a multi-purpose utility tool for scaffolding and managing indexers, both locally and in SQD Cloud.

Step 2: Scaffold the indexer project

Create a new squid project from the Whirlpool USDC-SOL swaps example:
sqd init hello-squid -t https://github.com/subsquid-labs/solana-example
cd hello-squid
Or, if you skipped the Squid CLI installation:
git clone https://github.com/subsquid-labs/solana-example hello-squid
cd hello-squid

Step 3: Inspect the project structure

Explore the project structure:
src/
├── abi
│   ├── abi.support.ts
│   ├── idl.support.ts
│   ├── token-program.ts
│   └── whirlpool
│       ├── index.ts
│       ├── instructions.ts
│       └── types.ts
├── main.ts
└── model
    ├── exchange.model.ts
    └── index.ts
Key files explained:
  • src/abi/ - utility modules used to filter and decode program data, generated from the Whirlpool program IDL (src/abi/whirlpool) or handwritten (src/abi/token-program.ts)
  • src/model/ - TypeORM model classes used in database operations
  • src/main.ts - the main executable containing the data retrieval configuration and processing logic
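For reference, here is a sketch of the Exchange entity from src/model/exchange.model.ts, reconstructed from the fields used in main.ts. It follows the @subsquid/typeorm-store decorator conventions; the exact decorators in your copy of the template may differ:
exchange.model.ts
import {BigIntColumn, DateTimeColumn, Entity, IntColumn, PrimaryColumn, StringColumn} from '@subsquid/typeorm-store'

@Entity()
export class Exchange {
  constructor(props?: Partial<Exchange>) {
    Object.assign(this, props)
  }

  @PrimaryColumn()
  id!: string

  @IntColumn()
  slot!: number

  @StringColumn()
  tx!: string

  @DateTimeColumn()
  timestamp!: Date

  @StringColumn()
  fromToken!: string

  @StringColumn()
  fromOwner!: string

  @BigIntColumn()
  fromAmount!: bigint

  @StringColumn()
  toToken!: string

  @StringColumn()
  toOwner!: string

  @BigIntColumn()
  toAmount!: bigint
}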
The main.ts file first defines the data source object and configures its data retrieval options:
main.ts
// Imports main.ts needs for the code below
import assert from 'assert'
import {run} from '@subsquid/batch-processor'
import {augmentBlock} from '@subsquid/solana-objects'
import {DataSourceBuilder} from '@subsquid/solana-stream'
import {TypeormDatabase} from '@subsquid/typeorm-store'
import * as tokenProgram from './abi/token-program'
import * as whirlpool from './abi/whirlpool'
import {Exchange} from './model'

// First we create a DataSource - a component that defines
// where to get the data and what data we should get.
const dataSource = new DataSourceBuilder()
  // Provide a Subsquid Network Portal URL.
  .setPortal({
    url: 'https://portal.sqd.dev/datasets/solana-mainnet',
    http: {
      retryAttempts: Infinity
    }
  })
  // Make sure that this block is above the first block
  // of the solana-mainnet dataset!
  // Find out the current first slot from
  //   curl https://portal.sqd.dev/datasets/solana-mainnet/metadata
  .setBlockRange({from: 317617480})
  //
  // Block data returned by the data source has the following structure:
  //
  // interface Block {
  //   header: BlockHeader
  //   transactions: Transaction[]
  //   instructions: Instruction[]
  //   logs: LogMessage[]
  //   balances: Balance[]
  //   tokenBalances: TokenBalance[]
  //   rewards: Reward[]
  // }
  //
  // For each block item we can specify a set of fields we want to fetch via `.setFields()` method.
  // Think of it as an SQL projection.
  //
  // Accurate selection of only required fields can have a notable positive impact
  // on performance when data is sourced from Subsquid Network.
  //
  // We do it below only for illustration as all fields we've selected
  // are fetched by default.
  //
  // It is possible to override default selection by setting undesired fields to `false`.
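  //
  // For example, `.setFields({transaction: {signatures: false}})` would
  // exclude transaction signatures from the fetched data.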
  .setFields({
    block: { // block header fields
      timestamp: true
    },
    transaction: { // transaction fields
      signatures: true
    },
    instruction: { // instruction fields
      programId: true,
      accounts: true,
      data: true
    },
    tokenBalance: { // token balance record fields
      preAmount: true,
      postAmount: true,
      preOwner: true,
      postOwner: true
    }
  })
  // By default, a block can be skipped if it doesn't contain any explicitly requested items.
  //
  // We request items via `.addXxx()` methods.
  //
  // Each `.addXxx()` method accepts item selection criteria
  // and also allows requesting related items.
  //
  .addInstruction({
    // select instructions that:
    where: {
      programId: [whirlpool.programId], // were executed by the Whirlpool program
      d8: [whirlpool.instructions.swap.d8], // have the first 8 bytes of .data equal to the swap descriptor
      ...whirlpool.instructions.swap.accountSelection({ // limiting to the USDC-SOL pair only
        whirlpool: ['7qbRF6YsyGuLUVs6Y1q64bdVrfe4ZcUUz1JRdoVNUJnm']
      }),
      isCommitted: true // and were successfully committed
    },
    // for each instruction selected above
    // make sure to also include:
    include: {
      innerInstructions: true, // inner instructions
      transaction: true, // the transaction that executed the given instruction
      transactionTokenBalances: true, // all token balance records of executed transaction
    }
  })
  .build()
Next, main.ts defines the data processing and storage logic. Data processing is defined in the batch handler, the callback that the run() function accepts as its final argument:
main.ts
// Below we create a `TypeormDatabase`.
//
// It provides a restricted subset of the TypeORM EntityManager API
// (https://typeorm.io/working-with-entity-manager) as a persistent
// storage interface and works with any Postgres-compatible database.
//
// Note that we don't pass any database connection parameters.
// That's because `TypeormDatabase` expects a certain project structure
// and environment variables to pick everything it needs by convention.
// The companion `@subsquid/typeorm-migration` tool works in the same way.
//
// For full configuration details please consult
// https://github.com/subsquid/squid-sdk/blob/278195bd5a5ed0a9e24bfb99ee7bbb86ff94ccb3/typeorm/typeorm-config/src/config.ts#L21
const database = new TypeormDatabase({supportHotBlocks: true})


// Now we are ready to start data processing
run(dataSource, database, async ctx => {
  // Block items that we get from `ctx.blocks` are flat JS objects.
  //
  // We can use `augmentBlock()` function from `@subsquid/solana-objects`
  // to enrich block items with references to related objects and
  // with convenient getters for derived data (e.g. `Instruction.d8`).

  let blocks = ctx.blocks.map(augmentBlock)

  let exchanges: Exchange[] = []

  for (let block of blocks) {
    for (let ins of block.instructions) {
      // https://read.cryptodatabytes.com/p/starter-guide-to-solana-data-analysis
      if (ins.programId === whirlpool.programId && ins.d8 === whirlpool.instructions.swap.d8) {
        let exchange = new Exchange({
          id: ins.id,
          slot: block.header.number,
          tx: ins.getTransaction().signatures[0],
          timestamp: new Date(block.header.timestamp * 1000)
        })

        // a committed Whirlpool swap is expected to execute exactly two inner
        // SPL Token transfers: user -> pool vault, then pool vault -> user
        assert(ins.inner.length === 2)
        let srcTransfer = tokenProgram.instructions.transfer.decode(ins.inner[0])
        let destTransfer = tokenProgram.instructions.transfer.decode(ins.inner[1])

        let srcBalance = ins.getTransaction().tokenBalances.find(tb => tb.account === srcTransfer.accounts.source)
        let destBalance = ins.getTransaction().tokenBalances.find(tb => tb.account === destTransfer.accounts.destination)

        // the mint of each side is read from the pool vault's token balance record:
        // the vault is the destination of the source transfer and the source of the destination transfer
        let srcMint = ins.getTransaction().tokenBalances.find(tb => tb.account === srcTransfer.accounts.destination)?.preMint
        let destMint = ins.getTransaction().tokenBalances.find(tb => tb.account === destTransfer.accounts.source)?.preMint

        assert(srcMint)
        assert(destMint)

        exchange.fromToken = srcMint
        exchange.fromOwner = srcBalance?.preOwner || srcTransfer.accounts.source
        exchange.fromAmount = srcTransfer.data.amount

        exchange.toToken = destMint
        exchange.toOwner = destBalance?.postOwner || destBalance?.preOwner || destTransfer.accounts.destination
        exchange.toAmount = destTransfer.data.amount

        exchanges.push(exchange)
      }
    }
  }

  await ctx.store.insert(exchanges)
})

Step 4: Install dependencies and build

Install dependencies and build the project:
npm i
npm run build
Verify the build completed successfully by checking for the lib/ directory.

Step 5: Start the database and processor

The processor continuously fetches data, decodes it, and stores it in PostgreSQL. All of this logic is defined in main.ts and is fully customizable.
First, start a local PostgreSQL database (the template includes a Docker Compose file):
docker compose up -d
The processor connects to PostgreSQL using connection parameters from .env. Ensure the database is running before proceeding.
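If you need to inspect or change these parameters, they live in the .env file at the project root. A typical template .env looks roughly like this (treat the values below as placeholders and check your local copy for the actual ones):
.env
DB_NAME=squid
DB_USER=postgres
DB_PASS=postgres
DB_PORT=23798
GQL_PORT=4350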
Apply database migrations:
npx squid-typeorm-migration apply
Then start the processor:
node -r dotenv/config lib/main.js
The indexer is now running and will begin processing blocks.

Step 6: Start the GraphQL API

Start the GraphQL API to serve the indexed swap data:
npx squid-graphql-server

Step 7: Query the data

You can now query your indexed data in the GraphiQL playground at localhost:4350/graphql.
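For example, a query like the following fetches the five most recent swaps from the command line. It assumes the standard squid GraphQL server schema generated for the Exchange entity, served on the default port 4350:
curl -s localhost:4350/graphql \
  -H 'Content-Type: application/json' \
  -d '{"query": "{ exchanges(limit: 5, orderBy: timestamp_DESC) { id timestamp fromToken fromAmount toToken toAmount } }"}'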