Overview
BlockDB delivers the highest on-chain data granularity available, starting directly from decoded EVM logs and extending into multi-level analytical layers. Each layer preserves schema stability, row-level lineage, and deterministic hashes, allowing you to trace any record back to its originating transaction. You use these dataset levels to choose the exact resolution you need — from raw execution data to fully derived analytics optimized for modeling, research, and real-time applications.

Granularity Levels
Level 1 — Decoded Raw On-Chain Events
Raw EVM execution data decoded into structured, schema-stable tables.

- Logs, transactions, calls, and function results
- Full event arguments in decoded form
- One row per event (no aggregation)
- Lineage hash links each row to the source block and log index
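The hashing scheme behind the lineage field is not spelled out here, but the minimal Python sketch below shows how a deterministic row hash could be derived from the source coordinates; the `lineage_hash` helper and the SHA-256 choice are illustrative assumptions, not BlockDB's actual implementation.

```python
import hashlib

def lineage_hash(block_number: int, tx_hash: str, log_index: int) -> str:
    # Deterministic digest of the row's source coordinates; any two rows decoded
    # from the same (block, transaction, log index) triple hash to the same value.
    payload = f"{block_number}:{tx_hash.lower()}:{log_index}".encode()
    return hashlib.sha256(payload).hexdigest()

# One row per event: a decoded Transfer log at block 19_000_000, log index 42
print(lineage_hash(19_000_000, "0xabc0000000000000000000000000000000000000000000000000000000000123", 42))
```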
Level 2 — Protocol-Aware Entities
Enriched datasets derived from Level 1 with application-level context.

- ERC-20 tokens
- ERC-721 tokens
- Liquidity pools (e.g., Uniswap v2/v3/v4)
- Pool configuration, fee tiers, tick spacing, token metadata
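As a rough illustration of what protocol-aware enrichment means, the sketch below joins a Level 1 transfer row with ERC-20 metadata. The field names (`token_address`, `raw_amount`, `decimals`) and the addresses are assumptions made for the example, not BlockDB's schema.

```python
# Hypothetical Level 1 rows and a Level 2 ERC-20 metadata table; field names
# and addresses are illustrative, not BlockDB's actual schema.
transfers = [
    {"token_address": "0xToken0", "raw_amount": 2_500_000, "block_number": 19_000_000, "log_index": 7},
]
erc20_tokens = {
    "0xToken0": {"symbol": "USDC", "decimals": 6},
}

# Enrichment: attach application-level context and a human-readable amount.
for row in transfers:
    meta = erc20_tokens[row["token_address"]]
    row["symbol"] = meta["symbol"]
    row["amount"] = row["raw_amount"] / 10 ** meta["decimals"]  # 2.5 units

print(transfers[0]["symbol"], transfers[0]["amount"])
```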
Level 3 — State & Liquidity Snapshots
Structured snapshots of DeFi protocol states, computed per block or per event.

- Liquidity pool reserves (token0/token1 amounts, active tick, square root price)
- Real-time liquidity depth derived from tick state
- Price ranges directly inferred from Uniswap v3 tick math
In Token-to-Token Prices L3, you receive the full price range implied by the active Uniswap v3 tick (upper and lower bounds). This gives you true microstructure-level visibility. Best for: price modeling, slippage estimation, liquidity defense strategies, arbitrage detection.
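The tick math itself is standard Uniswap v3 arithmetic. The sketch below shows how the spot price and the implied price bounds of the active tick range fall out of the pool's square-root price and tick spacing; decimal scaling and token ordering are simplified, and the example numbers are made up.

```python
# Standard Uniswap v3 tick math, simplified (no decimal adjustment).

def price_from_sqrt_price_x96(sqrt_price_x96: int) -> float:
    # Spot price of token1 per token0 from the pool's Q64.96 square-root price.
    return (sqrt_price_x96 / 2**96) ** 2

def tick_price_range(active_tick: int, tick_spacing: int) -> tuple[float, float]:
    # Lower/upper price bounds of the initialized tick range containing active_tick.
    lower_tick = (active_tick // tick_spacing) * tick_spacing
    upper_tick = lower_tick + tick_spacing
    return 1.0001 ** lower_tick, 1.0001 ** upper_tick

# Example: a 0.05% fee-tier pool (tick spacing 10) with active tick 201_245
low, high = tick_price_range(201_245, 10)
print(f"implied price range: {low:.6g} to {high:.6g}")
```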
Level 4 — Aggregated Analytics
Higher-order metrics derived from previous levels.

- VWAP, LWAP (VWAP sketched below)
- OHLC prices
- Aggregated liquidity and volume metrics
- Impact curves and depth-weighted prices
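To make the relationship to the lower levels concrete, here is a minimal sketch of how a VWAP figure is derived from trade-level rows; the `trades` structure and values are made up for illustration.

```python
# Illustrative trade-level rows rolled up into a Level 4 VWAP.
trades = [
    {"price": 3010.5, "volume": 2.0},
    {"price": 3012.0, "volume": 0.5},
    {"price": 3008.7, "volume": 1.2},
]

vwap = sum(t["price"] * t["volume"] for t in trades) / sum(t["volume"] for t in trades)
print(f"VWAP: {vwap:.2f}")
```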
Why Granularity Matters
Traceability
Every record includes a lineage hash that lets you verify:

- which block it came from
- which log indices contributed
- which derived rows depend on which raw rows
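One way this plays out in practice, sketched under assumed field names (`source_hashes`, `block_number`, `log_index`): a derived row carries the lineage hashes of the raw rows it was computed from, so its dependencies can be walked back to the source blocks and log indices.

```python
# Assumed field names; the point is the dependency walk from derived to raw rows.
raw_rows = {
    "a1f3": {"block_number": 19_000_000, "log_index": 7},
    "b9c2": {"block_number": 19_000_000, "log_index": 12},
}
derived_row = {"metric": "pool_volume", "value": 1234.5, "source_hashes": ["a1f3", "b9c2"]}

for h in derived_row["source_hashes"]:
    src = raw_rows[h]
    print(f"derived from block {src['block_number']}, log index {src['log_index']}")
```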
Deterministic Modeling
Stable schemas and versioning ensure long-term reproducibility for:

- backtests
- ML pipelines
- regulatory and audit requirements
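A small sketch of what this enables on the consumer side, with assumed field names: pin the schema version you tested against and sort rows by (block number, log index) so replays are identical from run to run.

```python
# Assumed field names; illustrates deterministic ordering and a schema-version check.
EXPECTED_SCHEMA_VERSION = "v1"

rows = [
    {"schema_version": "v1", "block_number": 19_000_001, "log_index": 3, "price": 3010.5},
    {"schema_version": "v1", "block_number": 19_000_000, "log_index": 9, "price": 3008.7},
]

assert all(r["schema_version"] == EXPECTED_SCHEMA_VERSION for r in rows), "schema drift"
rows.sort(key=lambda r: (r["block_number"], r["log_index"]))  # stable replay order
```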
Microstructure Precision
High granularity allows you to:

- compute exact price impact (see the sketch after this list)
- reconstruct pool state at any block
- analyze liquidity fragmentation
- detect regime shifts earlier
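As one example of exact price impact, the sketch below applies constant-product (Uniswap v2 style) math to reserve snapshots; the 0.3% fee and the reserve figures are assumptions, and concentrated-liquidity pools need the tick-based math shown earlier instead.

```python
def price_impact(reserve_in: float, reserve_out: float, amount_in: float, fee: float = 0.003) -> float:
    # Relative gap between the execution price and the pre-trade spot price
    # for a constant-product pool (x * y = k) with a proportional swap fee.
    amount_in_after_fee = amount_in * (1 - fee)
    amount_out = reserve_out * amount_in_after_fee / (reserve_in + amount_in_after_fee)
    spot_price = reserve_out / reserve_in
    exec_price = amount_out / amount_in
    return 1 - exec_price / spot_price

# Example: swap 50 token0 into a pool holding 10_000 token0 / 30_000_000 token1
print(f"price impact: {price_impact(10_000, 30_000_000, 50):.2%}")
```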
When to Choose Higher Granularity
Use higher granularity when you need:

- raw execution accuracy
- event-level resolution
- precise pool state
- deterministic replay of historical market conditions
Use the aggregated levels instead when you need:

- faster ingestion
- ready-to-use price or liquidity metrics
- simplified modeling inputs