SQD Network portals are currently in open beta. Please report any bugs or suggestions to the SQD Portal chat or to Squid Devs.
This guide walks you through migrating an EVM indexer that ingests real-time data over RPC to ingesting it from an SQD Network portal instead. It applies whether or not you have already migrated to a portal for historical data.
Prerequisites
Before starting, ensure you have:
An existing EVM indexer using @subsquid/evm-processor
Node.js and npm installed
Access to an SQD Portal URL for your network
Migration Steps
Install New Packages
Uninstall the old processor package and install the new portal-api packages:
npm uninstall @subsquid/evm-processor
npm i @subsquid/evm-stream@portal-api @subsquid/evm-objects@portal-api @subsquid/batch-processor@portal-api @subsquid/logger
Verify Portal Support
Make sure you have an SQD portal URL for your dataset and that real-time data is supported. The portal URL follows this pattern:
https://portal.sqd.dev/datasets/<dataset-slug>
where <dataset-slug> is the last path segment of the gateway URL for your network, found on the networks page. For example, for Ethereum mainnet the portal URL is:
https://portal.sqd.dev/datasets/ethereum-mainnet
Verify Real-Time Support
Query the /metadata endpoint to verify real-time data support:
curl -s --fail-with-body https://portal.sqd.dev/datasets/ethereum-mainnet/metadata
The response should show "real_time": true:
{
  "dataset": "ethereum-mainnet",
  "aliases": [],
  "real_time": true,
  "start_block": 0
}
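If you prefer to run this check from code, here is a minimal sketch using the global fetch available in Node.js 18+. The script itself is not part of the migration; only the /metadata endpoint and the real_time field shown above are assumed.
// check-portal.ts - minimal sketch: verify that a portal dataset serves real-time data
// Assumes Node.js 18+ (global fetch); the URL is the Ethereum mainnet example from above
const PORTAL_URL = 'https://portal.sqd.dev/datasets/ethereum-mainnet'

async function assertRealTime(portalUrl: string): Promise<void> {
  const res = await fetch(`${portalUrl}/metadata`)
  if (!res.ok) {
    throw new Error(`Metadata request failed with HTTP ${res.status}`)
  }
  const metadata = (await res.json()) as {dataset: string; real_time: boolean}
  if (!metadata.real_time) {
    throw new Error(`Dataset ${metadata.dataset} does not serve real-time data`)
  }
  console.log(`Dataset ${metadata.dataset} serves real-time data`)
}

assertRealTime(PORTAL_URL).catch((err) => {
  console.error(err)
  process.exit(1)
})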
Update Processor Configuration
Replace your processor configuration (typically at src/processor.ts or src/main.ts) with a data source configuration.
Update Imports
Replace the processor imports with the new data source and object imports:
-import {EvmBatchProcessor} from '@subsquid/evm-processor'
+import {DataSourceBuilder} from '@subsquid/evm-stream'
+import {augmentBlock} from '@subsquid/evm-objects'
+import {createLogger} from '@subsquid/logger'
Replace Processor Initialization
Replace EvmBatchProcessor initialization with DataSourceBuilder:
-const processor = new EvmBatchProcessor()
- .setGateway('https://v2.archive.subsquid.io/network/ethereum-mainnet')
- .setRpcEndpoint('https://rpc.ankr.com/eth')
- .setFinalityConfirmation(75)
+const dataSource = new DataSourceBuilder()
+ .setPortal('https://portal.sqd.dev/datasets/ethereum-mainnet')
RPC endpoint and finality confirmation settings are no longer needed.
Update Data Requests
Rewrite data requests using the new where-include-range syntax:
// Old flat object syntax
.addLog({
  address: [USDC_CONTRACT_ADDRESS],
  topic0: [usdcAbi.events.Transfer.topic],
  transaction: true, // include the parent transaction
  range: { from: 6_082_465 },
})
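For reference, the same request in the new syntax (it also appears in the complete diff below):
// New where-include-range syntax
.addLog({
  where: {
    address: [USDC_CONTRACT_ADDRESS],
    topic0: [usdcAbi.events.Transfer.topic],
  },
  include: {
    transaction: true, // include the parent transaction
  },
  range: { from: 6_082_465 },
})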
Build the Data Source
Include a .build() call at the end of the data source initialization:
  .setFields({
    log: {
      transactionHash: true,
    },
  })
+ .build()
If you pass your processor object between source files (e.g., from src/processor.ts to src/main.ts), pass the dataSource object in the same way.
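For example, assuming the data source is defined in src/processor.ts (the file split is your project's choice, not something the migration prescribes), the export and import might look like this:
// src/processor.ts
export const dataSource = new DataSourceBuilder()
  .setPortal('https://portal.sqd.dev/datasets/ethereum-mainnet')
  // ...addLog() requests and setFields() selection as above...
  .build()

// src/main.ts
import {dataSource} from './processor'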
Update the Run Function
Replace the processor.run() call with the unified run function.
Import the Run Function
Add the run function import to your main file:
+import {run} from '@subsquid/batch-processor'
Create a Logger
Manually create a logger for your batch handler:
const logger = createLogger('sqd:processor:mapping')
Replace processor.run()
Update the run call and enrich the context:
-processor.run(db, async (ctx) => {
+run(dataSource, db, async (simpleCtx) => {
+ const ctx = {
+ ...simpleCtx,
+ blocks: simpleCtx.blocks.map(augmentBlock),
+ log: logger
+ }
View Complete Diff Example
Here’s a full example of the changes up to this point:
--- a/src/main.ts
+++ b/src/main.ts
@@ -1,29 +1,43 @@
-import {EvmBatchProcessor} from '@subsquid/evm-processor'
+import {DataSourceBuilder} from '@subsquid/evm-stream'
+import {augmentBlock} from '@subsquid/evm-objects'
+import {run} from '@subsquid/batch-processor'
+import {createLogger} from '@subsquid/logger'
import {TypeormDatabase} from '@subsquid/typeorm-store'
import * as usdcAbi from './abi/usdc'
import {UsdcTransfer} from './model'
const USDC_CONTRACT_ADDRESS = '0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48'
-const processor = new EvmBatchProcessor()
- .setGateway('https://v2.archive.subsquid.io/network/ethereum-mainnet')
- .setRpcEndpoint('https://rpc.ankr.com/eth')
- .setFinalityConfirmation(75)
+const dataSource = new DataSourceBuilder()
+ .setPortal('https://portal.sqd.dev/datasets/ethereum-mainnet')
.addLog({
+ where: {
+ address: [USDC_CONTRACT_ADDRESS],
+ topic0: [usdcAbi.events.Transfer.topic],
+ },
+ include: {
+ transaction: true, // include the parent transaction
+ },
range: { from: 6_082_465 },
- address: [USDC_CONTRACT_ADDRESS],
- topic0: [usdcAbi.events.Transfer.topic],
- transaction: true, // include the parent transaction
})
.setFields({
log: {
transactionHash: true,
},
})
+ .build()
const db = new TypeormDatabase({supportHotBlocks: true})
-processor.run(db, async (ctx) => {
+const logger = createLogger('sqd:processor:mapping')
+
+run(dataSource, db, async (simpleCtx) => {
+ const ctx = {
+ ...simpleCtx,
+ blocks: simpleCtx.blocks.map(augmentBlock),
+ log: logger,
+ }
+
const transfers: UsdcTransfer[] = []
for (let block of ctx.blocks) {
for (let log of block.logs) {
Add RPC Client (Optional)
Only complete this step if you use direct RPC calls in your batch handler code. If you don’t make direct contract state queries, skip this step.
If you use direct RPC calls in your batch handler, you’ll need to add an RPC client to your context.
Install RPC Client
npm i @subsquid/rpc-client
Initialize RPC Client
Import and initialize an RpcClient:
import {RpcClient} from '@subsquid/rpc-client'

const rpcClient = new RpcClient({
  url: 'https://my_rpc_url',
  rateLimit: 100
})
Enrich Context
In your batch handler, add the _chain field to the context:
const ctx = {
...simpleCtx,
blocks: simpleCtx.blocks.map(augmentBlock),
log: logger,
+ _chain: {
+ client: rpcClient
+ }
}
Now your contract state queries will work as before:
const usdcContract = new usdcAbi.Contract(
  ctx,
  ctx.blocks[0].header,
  USDC_CONTRACT_ADDRESS
)
// Query decimals via a direct call to state at ctx.blocks[0].header.height
const decimals = await usdcContract.decimals()
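Putting this optional step together, a sketch of the assembled setup could look like the following; the RPC URL and rate limit are the placeholder values from above, and the handler body is elided.
import {RpcClient} from '@subsquid/rpc-client'

// Module-level client, created once and reused across batches
const rpcClient = new RpcClient({
  url: 'https://my_rpc_url',
  rateLimit: 100
})

run(dataSource, db, async (simpleCtx) => {
  const ctx = {
    ...simpleCtx,
    blocks: simpleCtx.blocks.map(augmentBlock),
    log: logger,
    _chain: {client: rpcClient}
  }
  // ...existing batch handler code, including contract state queries, goes here
})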
Migration Complete
Your indexer is now ready to source real-time data from an SQD Network portal. The portal provides improved performance and reliability compared to RPC endpoints.
Example Migrations
Complete migration examples for a simple USDC transfers indexer are available:
Next Steps