Indexing the Orca DEX
In this step-by-step tutorial we will look into a squid that gets data about swaps on the Orca Exchange from the Solana blockchain.

Pre-requisites: Node.js v20 or newer, Git, Docker.

Download the project
Begin by retrieving the template and installing the dependencies:
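A sketch of this step, assuming the squid template lives in the subsquid-labs/solana-example repository; substitute the repository URL used by your setup if it differs:

```bash
# Clone the Solana squid template and install its dependencies
git clone https://github.com/subsquid-labs/solana-example
cd solana-example
npm i
```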
Interfacing with the Whirlpool program

First, we inspect the data available for indexing. On Solana, most programs use the Anchor framework. Anchor makes the metadata describing the shape of the instructions, transactions and contract variables available as an Interface Definition Language (IDL) JSON file. For many popular programs (including Whirlpool) IDL files are published on-chain. SQD provides a tool for retrieving program IDLs and generating boilerplate ABI code for data decoding. This can be done with the typegen command shown below; src/abi is the destination folder and the whirlpool suffix sets the base name for the generated files.
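A sketch of the invocation, assuming the squid-solana-typegen binary from the @subsquid/solana-typegen package and the public Whirlpool program address; check the package documentation for the exact syntax:

```bash
# Fetch the on-chain IDL of the Whirlpool program and generate decoding code under src/abi
npx squid-solana-typegen src/abi whirLbMiicVdio4qvUfM5KAg6Ct8VwpYzGff3uctyCc#whirlpool
```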
Check out the generated src/abi/whirlpool/instructions.ts file. It exports an instruction instance for every instruction in the ABI. Here is how it is initialized for swap:
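An abbreviated sketch of what the generated initializer typically looks like; the exact import paths, account list and argument codecs depend on the IDL and the typegen version:

```ts
// src/abi/whirlpool/instructions.ts (excerpt)
import {struct, u64, u128, bool} from '@subsquid/borsh' // codecs used by the generated code
import {instruction} from '../abi.support'               // helper generated alongside the ABI

export const swap = instruction(
    {
        // Anchor discriminator: the first eight bytes of sha256('global:swap')
        d8: '0xf8c69e91e17587c8',
    },
    {
        // positions of the accounts the instruction touches
        tokenProgram: 0,
        tokenAuthority: 1,
        whirlpool: 2,
        // ...the remaining accounts are omitted here
    },
    struct({
        // instruction arguments and their codecs
        amount: u64,
        otherAmountThreshold: u64,
        sqrtPriceLimit: u128,
        amountSpecifiedIsInput: bool,
        aToB: bool,
    })
)
```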
d8 holds the eight bytes (the Anchor instruction discriminator) that the relevant instruction data starts with.
Configuring the data source
“Data source” is a component that defines what data should be retrieved and where to get it. To configure the data source to retrieve the data produced by the swap instruction of the Whirlpool program, we initialize it like this:
src/main.ts
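A condensed sketch of the configuration, assuming the DataSourceBuilder and SolanaRpcClient classes from @subsquid/solana-stream and the whirlpool ABI module generated in the previous step; the field selection and filter options shown here are illustrative:

```ts
import {DataSourceBuilder, SolanaRpcClient} from '@subsquid/solana-stream'
import * as whirlpool from './abi/whirlpool'

export const dataSource = new DataSourceBuilder()
    // Public SQD Network gateway for Solana mainnet
    .setGateway('https://v2.archive.subsquid.io/network/solana-mainnet')
    // Optional RPC endpoint, used once the squid catches up with the network head
    .setRpc(process.env.SOLANA_NODE == null ? undefined : {
        client: new SolanaRpcClient({url: process.env.SOLANA_NODE}),
        strideConcurrency: 10,
    })
    // First Solana block currently indexed by SQD
    .setBlockRange({from: 259_984_950})
    // Request only the fields the squid actually uses
    .setFields({
        block: {timestamp: true},
        transaction: {signatures: true},
        instruction: {programId: true, accounts: true, data: true},
        tokenBalance: {preAmount: true, postAmount: true, preOwner: true, postOwner: true},
    })
    // Select Whirlpool swap instructions together with related data
    .addInstruction({
        where: {
            programId: [whirlpool.programId],
            d8: [whirlpool.instructions.swap.d8],
            isCommitted: true,
        },
        include: {
            innerInstructions: true,
            transaction: true,
            transactionTokenBalances: true,
        },
    })
    .build()
```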
- 'https://v2.archive.subsquid.io/network/solana-mainnet' is the address of the public SQD Network gateway for Solana mainnet. The only other Solana-compatible dataset currently available is Eclipse Testnet, with the gateway at 'https://v2.archive.subsquid.io/network/eclipse-testnet'. Many other networks are available on EVM and Substrate - see the exhaustive public networks list.
- process.env.SOLANA_NODE is an environment variable pointing at a public RPC endpoint we chose to use in this example. When an endpoint is available, the processor will begin ingesting data from it once it reaches the highest block available within SQD Network.
- 259_984_950 is the first Solana block currently indexed by SQD.
- The argument of addInstruction() is a set of filters that tells the processor to retrieve all data on the swap instruction of the Whirlpool program, that is, on instructions whose discriminator matches the hash of <namespace>:<instruction> for swap.
- The argument of setFields() specifies the exact fields we need for every data item type.
See the SolanaDataSource reference for more options.
With a data source it becomes possible to retrieve filtered blockchain data from SQD Network, transform it and save the result to a destination of choice.
Decoding the event data
The other part of the squid processor (the ingester process of the indexer) is the callback function used to process batches of the filtered data - the batch handler. In the Solana Squid SDK it is typically defined within a run() call, like this:
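A minimal sketch of the call shape, assuming the run() function from @subsquid/batch-processor and the TypeormDatabase class from @subsquid/typeorm-store; the dataSource object is the one configured in the previous section:

```ts
import {run} from '@subsquid/batch-processor'
import {TypeormDatabase} from '@subsquid/typeorm-store'

const database = new TypeormDatabase()

run(dataSource, database, async ctx => {
    // the batch handler: transform ctx.blocks and persist the results via ctx.store
})
```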
- dataSource is the data source object described in the previous section.
- database is a Database implementation specific to the target data sink. We want to store the data in a PostgreSQL database and present it with a GraphQL API, so we provide a TypeormDatabase object here.
- ctx is a batch context object that exposes a batch of data (at ctx.blocks) and any data persistence facilities derived from database (at ctx.store). See Block data for Solana for details on how the data batches are presented.
src/main.ts
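A condensed sketch of the full batch handler; augmentBlock comes from @subsquid/solana-objects, the token-program ABI module is assumed to be generated next to the whirlpool one, and the Exchange entity together with the exact shape of the augmented instruction items (d8, inner, getTransaction()) are assumptions that may differ from your template:

```ts
import {run} from '@subsquid/batch-processor'
import {augmentBlock} from '@subsquid/solana-objects'
import {TypeormDatabase} from '@subsquid/typeorm-store'
import * as tokenProgram from './abi/token-program'
import * as whirlpool from './abi/whirlpool'
import {Exchange} from './model' // hypothetical TypeORM entity for decoded swaps

// dataSource is the constant configured earlier in this file
run(dataSource, new TypeormDatabase(), async ctx => {
    // Enrich the flat block items with navigation helpers such as getTransaction()
    let blocks = ctx.blocks.map(augmentBlock)

    let exchanges: Exchange[] = []

    for (let block of blocks) {
        for (let ins of block.instructions) {
            // Keep only the swap instructions of the Whirlpool program
            if (ins.programId === whirlpool.programId && ins.d8 === whirlpool.instructions.swap.d8) {
                let tx = ins.getTransaction()

                // A swap executes two inner SPL Token transfers: user -> pool and pool -> user
                let srcTransfer = tokenProgram.instructions.transfer.decode(ins.inner[0])
                let destTransfer = tokenProgram.instructions.transfer.decode(ins.inner[1])

                exchanges.push(new Exchange({
                    id: ins.id,
                    slot: block.header.slot,
                    tx: tx.signatures[0],
                    timestamp: new Date(block.header.timestamp * 1000),
                    fromAccount: srcTransfer.accounts.source,
                    toAccount: destTransfer.accounts.destination,
                    fromAmount: srcTransfer.data.amount,
                    toAmount: destTransfer.data.amount,
                }))
            }
        }
    }

    await ctx.store.insert(exchanges)
})
```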
The handler iterates over the blocks of the batch, finds every swap instruction from the Whirlpool program and decodes the data of each of its inner instructions.
Then it retrieves the parent transaction as well as the source and destination accounts from the decoded inner instructions.
The decoding is done with the tokenProgram.instructions.transfer.decode function from the TypeScript ABI provided in the project.
At this point the squid is ready for its first test run. Execute the commands below to start a local database, build the squid and launch the processor.
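The exact commands depend on the template; a typical sequence, assuming a docker-compose.yml with a PostgreSQL service and the standard SQD tooling, is:

```bash
# Start a local PostgreSQL instance
docker compose up -d
# Compile the TypeScript sources
npm run build
# Apply the database migrations
npx squid-typeorm-migration apply
# Launch the processor
node -r dotenv/config lib/main.js
```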

