Why Use the REST API

  • Sub-second access to any dataset supported by the EVM API suite.
  • Works behind outbound-only firewalls—no need to expose webhooks.
  • Fine-grained filters reduce egress costs; pull only the chains, blocks, or pools you need.

Core Concepts

  • Endpoints: Each dataset has a dedicated path (e.g., /evm/blocks, /evm/prices-spot-depth). All requests are POST.
  • Pagination: Responses contain cursor and count. Pass the cursor from the previous response to request the next page (see the sketch after this list).
  • Ordering: Use range filters (range.from/range.to, _updated_at windows, sequence numbers) to advance chronologically.
  • Authentication: OAuth 2.0 bearer tokens scoped to your entitlements. Rotate tokens per your security policy.
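
To make the cursor flow concrete, here is a minimal sketch of a single-page request in Python. It assumes the standard requests package, the token in the BLOCKDB_TOKEN environment variable (as in the curl example below), and that the response JSON exposes the cursor and count fields described above; any other names are illustrative.

import os
import requests

BASE_URL = "https://api.blockdb.io/v1"
HEADERS = {
    "Authorization": f"Bearer {os.environ['BLOCKDB_TOKEN']}",
    "Content-Type": "application/json",
}

def fetch_page(cursor=None):
    """POST one page of /evm/blocks; cursor=None seeds a new stream."""
    body = {
        "chain_id": 1,
        "range": {"from": 19000000},
        "fields": ["number", "hash", "timestamp", "_tracing_id"],
        "cursor": cursor,
    }
    resp = requests.post(f"{BASE_URL}/evm/blocks", json=body,
                         headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()

page = fetch_page()                      # first page: cursor=None seeds the stream
next_page = fetch_page(page["cursor"])   # next page: pass the returned cursor
print(page["count"], next_page["count"])

The later sketches on this page reuse fetch_page.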

Example Poller

curl -X POST https://api.blockdb.io/v1/evm/blocks \
  -H "Authorization: Bearer $BLOCKDB_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "chain_id": 1,
    "range": {"from": 19000000},
    "fields": ["number","hash","timestamp","_tracing_id"],
    "cursor": "'$CURSOR'"
  }'
Pseudo-loop (a runnable sketch follows these steps):
  1. Call the endpoint with cursor=null to seed the stream.
  2. Persist the returned cursor and latest _updated_at.
  3. Re-run every few seconds or minutes; include the saved cursor to continue where you left off.
  4. On failure, replay from the last committed cursor; paired with upserts (see Best Practices), replays are idempotent.
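
Put together, the loop might look like the sketch below. It reuses the fetch_page helper from the pagination sketch; the checkpoint file, the save_rows stub, the results key for the row payload, and treating _updated_at as a per-row field are all assumptions standing in for your own storage layer and the exact response shape.

import json
import time

CHECKPOINT = "blocks_cursor.json"       # illustrative local checkpoint store
POLL_INTERVAL_SECONDS = 15

def load_checkpoint():
    try:
        with open(CHECKPOINT) as f:
            return json.load(f)
    except FileNotFoundError:
        return {"cursor": None, "updated_at": None}   # step 1: seed with cursor=null

def save_checkpoint(cursor, updated_at):
    with open(CHECKPOINT, "w") as f:
        json.dump({"cursor": cursor, "updated_at": updated_at}, f)

def save_rows(rows):
    pass  # upsert into your store (see Best Practices below)

while True:
    state = load_checkpoint()
    try:
        page = fetch_page(state["cursor"])
        rows = page.get("results", [])                # assumed key for the row payload
        save_rows(rows)
        stamps = [r["_updated_at"] for r in rows if "_updated_at" in r]
        latest = max(stamps) if stamps else state["updated_at"]
        save_checkpoint(page["cursor"], latest)       # step 2: persist cursor + _updated_at
    except Exception:
        pass   # step 4: next iteration replays from the last committed cursor
    time.sleep(POLL_INTERVAL_SECONDS)                 # step 3: re-run on an interval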

Best Practices

  • Backoff: Implement exponential backoff and respect the rate limits documented in your contract (a sketch follows this list).
  • Schema evolution: Subscribe to Schema Governance so pollers can handle new fields gracefully.
  • Verification: Periodically call Verification endpoints for sampled _tracing_id values.
  • Upserts: Merge by dataset-specific key + _tracing_id to avoid duplication (see the upsert sketch after this list).
  • Use separate pollers per dataset family (ledger, pricing, lineage) to prevent one noisy workload from blocking another.
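
For the backoff item, a minimal sketch wrapping fetch_page with exponential backoff and jitter; the retryable status codes and the one-minute cap are assumptions, so align them with the rate limits documented in your contract.

import random
import time
import requests

def fetch_page_with_backoff(cursor=None, max_attempts=6):
    delay = 1.0
    for attempt in range(max_attempts):
        try:
            return fetch_page(cursor)       # helper from the pagination sketch
        except requests.HTTPError as exc:
            status = exc.response.status_code if exc.response is not None else None
            retryable = status in (429, 500, 502, 503, 504)
            if not retryable or attempt == max_attempts - 1:
                raise                       # non-retryable error or out of attempts
        time.sleep(delay + random.uniform(0, delay))   # full jitter
        delay = min(delay * 2, 60.0)                   # cap the wait at one minute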
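
For the upsert item, here is one way to fill in the save_rows stub from the poller sketch, merging blocks by number plus _tracing_id with SQLite's ON CONFLICT clause; the table layout is illustrative, and the merge key should follow whatever dataset-specific key applies to the dataset you poll.

import sqlite3

conn = sqlite3.connect("blocks.db")        # illustrative local store
conn.execute("""
    CREATE TABLE IF NOT EXISTS blocks (
        number      INTEGER,
        tracing_id  TEXT,
        hash        TEXT,
        timestamp   INTEGER,
        PRIMARY KEY (number, tracing_id)    -- dataset key + _tracing_id
    )
""")

def save_rows(rows):
    conn.executemany(
        """
        INSERT INTO blocks (number, tracing_id, hash, timestamp)
        VALUES (:number, :tracing_id, :hash, :timestamp)
        ON CONFLICT (number, tracing_id) DO UPDATE SET
            hash = excluded.hash,
            timestamp = excluded.timestamp
        """,
        [
            {
                "number": r["number"],
                "tracing_id": r["_tracing_id"],
                "hash": r["hash"],
                "timestamp": r["timestamp"],
            }
            for r in rows
        ],
    )
    conn.commit()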