Filters

Updated on Feb 04, 2026
API Credits Billing

Streams consumes API credits based on the number of blocks processed by your Stream, determined by network and dataset type. Filters help you shape your data pipeline by controlling exactly what data reaches your destination. Use the API Credits Calculator to estimate your usage.

Filters enable you to customize and filter your streams payload using JavaScript (ECMAScript) code. With the help of filters, you can match specific patterns and rules against the data from your stream, and personalize the data sent to your destination. This feature can be configured within the Quicknode developer portal, and through the Streams REST API.

Understanding Filters

In Streams, a filter_function is an optional configuration option for your stream. This function must be named main, and it modifies the stream's data before it is sent to its destination. Using filters, you can precisely control the data you receive, ensuring you stream only the data your application needs while optimizing your downstream data pipeline.

Benefits of Filters

  • Shape Your Data Pipeline: Focus on exactly the data you need by filtering out irrelevant blocks and transactions.
  • Optimize Downstream Processing: Receive only filtered data, reducing bandwidth and storage requirements for your destination systems.
  • Customizability: Implement custom filter functions to match specific patterns or criteria, offering great flexibility in data handling.
  • Enhanced Data Relevance: Filters ensure that you receive data that is directly relevant to your needs, increasing the overall usefulness of the data.
  • Transform Data Before Delivery: Customize the payload from your stream before it is sent to your Streams destination.

How Filters Work with Billing

Streams consumes API credits based on blocks processed, not data delivered. Filtering does not reduce API credit consumption; you are billed for every block your Stream processes, regardless of what the filter returns.

Filters help you shape your data pipeline by controlling what data reaches your destination, optimizing bandwidth and storage for your downstream systems. Use the API Credits Calculator to estimate your usage.

Legacy GB Billing Information (Existing Users Only)

Streams usage is metered by the amount of data delivered to your destination, based on your Quicknode rate plan. Note that there is a 2.5KB minimum per processed block for Streams on Free Trial plans.

Free Trial Plan

  • Includes 1GB of data per month across all streams
  • Minimum data metered per block is 2.5KB when using filters
  • Data is charged at actual size when not using filters

Paid Plans

  • Pay only for the exact size of data you receive
  • No minimum charges
  • Example: If you filter a 10KB block down to 1KB, you only pay for 1KB

How Filtering Affects Your Bill (GB Billing)

Filters reduce costs by sending only the data you need. Here's how data is metered:

| What You Get | Free Trial Plan | Paid Plans |
| --- | --- | --- |
| 1KB of data | 1KB | 1KB |
| 3KB of data | 3KB | 3KB |
| Filtered to 1KB | 2.5KB | 1KB |
| Filtered to 3KB | 3KB | 3KB |
| Filtered to 0KB | 2.5KB | 0KB |

Available Features by Plan

Free Trial Plan

  • Basic streaming
  • 1GB monthly data limit
  • No compression
  • No reorg handling

Paid Plans

  • All free features, plus:
  • Compression to reduce data size
  • Reorg handling

Track Your Usage

Monitor your Streams activity in real-time:

  • Stream Dashboard: Per-stream statistics
  • Filter Metrics: Performance and blocks processed
  • Usage Panel: Stream activity and blocks processed
  • Billing Page: Credit consumption history

Example Filter Functions

Below are examples of how you might define a filter function to target data from Streams.

Please note:

  • Your filter must be named main.
  • Your filter function must return an object.

Payload Shape (Data & Metadata)

Streams will send the payload data in stream.data and the metadata in stream.metadata.
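As a minimal sketch, a filter can read both fields and forward them; note that the exact shape of stream.metadata varies by network and dataset, so the fields used in the test data here are placeholders rather than a guaranteed schema:

```javascript
// Minimal sketch of a filter that touches both the payload and the metadata.
// The exact shape of stream.metadata depends on your stream's network and dataset.
function main(stream) {
  const data = stream.data
  const metadata = stream.metadata

  return {
    // Forward the metadata unchanged alongside a summary of the payload
    payloadItems: Array.isArray(data) ? data.length : 1,
    metadata: metadata,
  }
}
```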

Return Hash and Block Number

This filter works with the block dataset.

function main(stream) {
  const data = stream.data

  var numberDecimal = parseInt(data[0].number, 16)
  var filteredData = {
    hash: data[0].hash,
    number: numberDecimal,
  }
  return filteredData
}

Get Receipts for ERC20 Transfers

This filter works with the block_with_receipts dataset.

function main(stream) {
  try {
    const data = stream.data
    var filteredReceipts = []

    data.receipts.forEach(receipt => {
      let relevantLogs = receipt.logs.filter(
        log =>
          log.topics[0] ===
            '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef' &&
          log.topics.length === 3
      )
      if (relevantLogs.length > 0) {
        filteredReceipts.push(receipt)
      }
    })

    return {
      totalReceipts: data.receipts.length,
      filteredCount: filteredReceipts.length,
      receipts: filteredReceipts,
    }
  } catch (e) {
    return { error: e.message }
  }
}

Get Transactions and Receipts for Specific Addresses

This filter works with the block_with_receipts dataset.

function main(stream) {
  try {
    const data = stream.data

    const addresses = [
      '0x56220b7e25c7d0885159915cdebf5819f2090f57',
      '0x351e1b4079cf180971025a3b35dadea1d809de26',
      '0xa61551e4e455edebaa7c59f006a1d2956d46eecc',
    ]
    var addressSet = new Set(addresses.map(address => address.toLowerCase()))
    var paddedAddressSet = new Set(
      addresses.map(
        address => '0x' + address.toLowerCase().slice(2).padStart(64, '0')
      )
    )

    var matchingTransactions = []
    var matchingReceipts = []

    data.block.transactions.forEach(transaction => {
      let transactionMatches =
        (transaction.from && addressSet.has(transaction.from.toLowerCase())) ||
        (transaction.to && addressSet.has(transaction.to.toLowerCase()))

      if (transactionMatches) {
        matchingTransactions.push(transaction)
      }
    })

    data.receipts.forEach(receipt => {
      let receiptMatches =
        receipt.logs &&
        receipt.logs.some(
          log =>
            log.topics &&
            log.topics.length > 1 &&
            (paddedAddressSet.has(log.topics[1]) ||
              (log.topics.length > 2 && paddedAddressSet.has(log.topics[2])))
        )
      if (receiptMatches) {
        matchingReceipts.push(receipt)
      }
    })

    if (matchingTransactions.length === 0 && matchingReceipts.length === 0) {
      return null
    }

    return {
      transactions: matchingTransactions,
      receipts: matchingReceipts,
    }
  } catch (e) {
    return { error: e.message }
  }
}

How to Use Filters

To apply filters to your stream, specify the filter_function configuration option when creating a stream with the Streams REST API, or in the "Stream payload" step of the Streams configuration wizard. You can then define your custom filter function to match specific criteria or patterns according to your requirements.

To omit sending payloads when your filter doesn't match data within a specific block, you can add conditional logic to return null instead of an empty result.
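As a sketch of that pattern over the block dataset used earlier (the contract-creation check is just an illustrative matching condition):

```javascript
// Sketch: deliver a payload only when the block contains matching transactions;
// returning null skips delivery for this block entirely.
function main(stream) {
  const data = stream.data
  // Illustrative criterion: contract-creation transactions have a null `to` field.
  const creations = (data[0].transactions || []).filter(tx => tx.to === null)

  if (creations.length === 0) {
    return null // nothing matched; send nothing to the destination
  }
  return { contractCreations: creations }
}
```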

Using Key-Value Store with Streams filters


Key-Value Store can be accessed and managed seamlessly within a Streams filter. You can create new lists and key-value sets, add and remove items from your lists, update key-value sets, retrieve set values, and check whether data from the streaming dataset matches an item on your list. To learn more about Key-Value Store, please visit the Key-Value Store documentation.


note

All qnLib methods are asynchronous and return Promises. Always use the await keyword when calling these methods, and ensure your filter's main function is declared with async.

Available Key-Value Store functions inside your Streams filter

Lists


  • qnUpsertList: Creates or updates a new list in Key-Value Store.
  • qnAddListItem: Adds an item to a specific list in Key-Value Store.
  • qnRemoveListItem: Removes an item from a specific list in Key-Value Store.
  • qnContainsListItems: Batch lookup to check many items for membership in a specific list in Key-Value Store. Always batch all lookups into as few calls as possible to reduce round trips to the KV store—this is optimized for performance.
  • qnDeleteList: Deletes a specific list in Key-Value Store.

Example filter code for lists

The filter code below can be used within Streams to demonstrate ways that Key-Value Store lists can be used within your Streams filter.

async function main() {
  // List of results for each operation
  let results = {}

  try {
    // Create a new list
    results.createList = await qnLib.qnUpsertList('list_docs_example', {
      add_items: ['item1', 'item2'],
    })

    // Update a list
    results.qnUpsertList = await qnLib.qnUpsertList('list_docs_example', {
      add_items: ['item3'],
      remove_items: ['item1'],
    })

    // Add items to the list
    results.qnAddListItem4 = await qnLib.qnAddListItem('list_docs_example', 'item4')
    results.qnAddListItem5 = await qnLib.qnAddListItem('list_docs_example', 'item5')

    // Remove an item from the list
    results.qnRemoveListItem4 = await qnLib.qnRemoveListItem('list_docs_example', 'item4')

    // Batch check multiple items for list membership (one round trip)
    results.qnContainsListItems = await qnLib.qnContainsListItems(
      'list_docs_example',
      ['item1', 'item2']
    )

    // Delete the list
    results.qnDeleteList = await qnLib.qnDeleteList('list_docs_example')
  } catch (error) {
    results.error = error.message
  }

  return results
}

Sets


  • qnAddSet: Creates a key-value set in Key-Value Store.
  • qnBulkSets: Creates and removes bulk key-value sets in Key-Value Store.
  • qnDeleteSet: Deletes a key-value set from Key-Value Store.
  • qnGetSet: Retrieves the value of a specific key from Key-Value Store.
  • qnListAllSets: List all keys for sets in Key-Value Store.

Example filter code for Key-Value Store sets

The filter code below can be used within Streams to demonstrate ways that Key-Value Store sets can be used within your Streams filter.

async function main() {
  // List of results for each operation
  let results = {}

  try {
    // Create a set
    results.qnAddSet = await qnLib.qnAddSet('set_docs_example', 'value1')

    // Get a set
    results.qnGetSet = await qnLib.qnGetSet('set_docs_example')

    // Get all sets
    results.qnListAllSets = await qnLib.qnListAllSets()

    // Bulk add/remove sets
    results.qnBulkSets = await qnLib.qnBulkSets({
      add_sets: {
        set_docs_example2: 'value1',
        set_docs_example3: 'value2',
      },
      delete_sets: ['set_docs_example'],
    })

    // Get all sets after the bulk operation
    results.qnListAllSets2 = await qnLib.qnListAllSets()

    // Delete the remaining sets
    results.qnDeleteSet2 = await qnLib.qnDeleteSet('set_docs_example2')
    results.qnDeleteSet3 = await qnLib.qnDeleteSet('set_docs_example3')
  } catch (error) {
    results.error = error.message
  }

  return results
}

Common Pitfalls with Key-Value Store Methods

Missing await keyword

All qnLib methods are asynchronous and return Promises. Failing to use await will result in unexpected behavior:

// INCORRECT - returns a Promise, not the actual result
function main() {
  const result = qnLib.qnContainsListItems('myList', ['item']);
  // result is a Promise object, not the boolean array
  return result;
}

// CORRECT - returns the resolved value from the Promise
async function main() {
  const result = await qnLib.qnContainsListItems('myList', ['item']);
  // result contains the index-aligned boolean array
  return result;
}

Boolean checks with async methods

When checking list membership, use qnContainsListItems and batch your lookups. Be careful to await the result when using it in conditionals:

// INCORRECT - checks if Promise exists, not the result
function main() {
  if (qnLib.qnContainsListItems('myList', ['item'])) {
    // This will always evaluate to true regardless of whether
    // the item is in the list or not!
    return { exists: true };
  }
  return { exists: false };
}

// CORRECT - batch lookup, then use the index-aligned boolean array
async function main() {
  const hits = await qnLib.qnContainsListItems('myList', ['item']);
  if (hits[0]) {
    return { exists: true };
  }
  return { exists: false };
}

Error handling for large operations

When working with large datasets, it's important to implement proper error handling:

// INCORRECT - no error handling for large list operations
async function main() {
  // This might fail with large lists
  await qnLib.qnUpsertList('myList', { add_items: veryLargeArray });
  return { success: true };
}

// CORRECT - handles potential size-related errors
async function main() {
  try {
    // Use chunking for large operations
    const chunkSize = 50000;
    for (let i = 0; i < veryLargeArray.length; i += chunkSize) {
      const chunk = veryLargeArray.slice(i, i + chunkSize);
      await qnLib.qnUpsertList('myList', { add_items: chunk });
    }
    return { success: true };
  } catch (error) {
    return { error: `Failed to update list: ${error.message}` };
  }
}

Cascading async operations

Chain operations carefully to avoid cascading failures:

// INCORRECT - continues execution even if a prerequisite operation fails
async function main() {
  await qnLib.qnAddListItem('myList', 'important_item');
  const hits = await qnLib.qnContainsListItems('myList', ['important_item']);
  return { found: hits[0] };
}

// CORRECT - checks intermediate results before continuing
async function main() {
  try {
    const addResult = await qnLib.qnAddListItem('myList', 'important_item');
    if (!addResult) {
      return { error: "Failed to add item to list" };
    }
    const hits = await qnLib.qnContainsListItems('myList', ['important_item']);
    return { found: hits[0] };
  } catch (error) {
    return { error: `Operation failed: ${error.message}` };
  }
}

Performance: batch list lookups with qnContainsListItems


Batch all list membership checks into as few qnContainsListItems calls as possible to reduce round trips to the Key-Value Store. This batch lookup is optimized for performance.

  • qnContainsListItems(listId, items[]) → Promise<boolean[]> (index-aligned to items)

// Gather all addresses, then one batched lookup — minimizes round trips.
async function main(stream) {
  const listId = "WATCHLIST_EVM_MAINNET";

  // Gather from/to
  const addrs = [];
  for (const chunk of stream.data ?? []) {
    for (const tx of chunk.transactions ?? []) {
      if (tx.from) addrs.push(tx.from.toLowerCase());
      if (tx.to) addrs.push(tx.to.toLowerCase());
    }
  }
  if (addrs.length === 0) return null;

  // De-dup before the single batch call
  const uniq = Array.from(new Set(addrs));
  const hits = await qnLib.qnContainsListItems(listId, uniq);

  // Fast lookup map (index-aligned to uniq)
  const hitMap = new Map(uniq.map((a, i) => [a, !!hits[i]]));

  // Return matched txs
  const matched = [];
  for (const chunk of stream.data ?? []) {
    for (const tx of chunk.transactions ?? []) {
      const f = tx.from?.toLowerCase();
      const t = tx.to?.toLowerCase();
      if ((f && hitMap.get(f)) || (t && hitMap.get(t))) matched.push(tx);
    }
  }

  return matched.length ? { data: matched } : null;
}

When you need to check list membership for multiple values (e.g. many tx.from/tx.to or many log addresses), collect them, de-duplicate, then call qnContainsListItems(listId, uniq) once and use the returned boolean array. Avoid multiple per-item lookups.

Decoding EVM Data

When working with EVM-compatible chains, you can decode transaction receipts and logs using the decodeEVMReceipts function. This function takes raw transaction receipts and contract ABIs as inputs, and transforms the encoded blockchain data into a human-readable format.

The decoding process:

  1. Matches event signatures in transaction logs with the provided ABIs
  2. Decodes parameters according to their types (addresses, integers, strings, etc.)
  3. Returns structured data with named parameters instead of raw hex data, in the decodedLogs object

The decodeEVMReceipts function accepts two parameters:

  • receipts: Array of transaction receipts to decode
  • abis: Array of contract ABIs (can be passed as strings or objects)

This enables you to:

  • Monitor specific smart contract events
  • Extract and process event parameters
  • Filter transactions based on decoded data
  • Track multiple contracts and event types simultaneously

Below you can find examples of Filter code that relies on decodeEVMReceipts for decoding EVM data, using block_with_receipts dataset.

Basic ERC20 Transfer Events

function main(stream) {
  const erc20Abi = `[{
    "anonymous": false,
    "inputs": [
      {"indexed": true, "type": "address", "name": "from"},
      {"indexed": true, "type": "address", "name": "to"},
      {"indexed": false, "type": "uint256", "name": "value"}
    ],
    "name": "Transfer",
    "type": "event"
  }]`

  const data = stream.data
  var result = decodeEVMReceipts(data[0].receipts, [erc20Abi])

  // Filter for receipts with decoded logs
  result = result.filter(
    receipt => receipt.decodedLogs && receipt.decodedLogs.length > 0
  )

  return { result }
}

Dynamic ABI Loading with Key-Value Store

You can store contract ABIs in Key-Value Store and load them dynamically in your filter. This approach is particularly useful when working with multiple contracts or when ABIs need to be updated frequently.

Uploading ABIs to Key-Value Store

Before using the ABIs in the filter examples below, you need to upload them to the Key-Value Store. You can do this using the REST API; you will need to create an API key in the Quicknode dashboard to use the Key-Value Store REST APIs.

curl -X POST \
  "https://api.quicknode.com/kv/rest/v1/sets" \
  -H "accept: application/json" \
  -H "Content-Type: application/json" \
  -H "x-api-key: YOUR_API_KEY" \
  -d '{
    "key": "usdc_abi",
    "value": "[{\"anonymous\":false,\"inputs\":[{\"indexed\":true,\"name\":\"from\",\"type\":\"address\"},{\"indexed\":true,\"name\":\"to\",\"type\":\"address\"},{\"indexed\":false,\"name\":\"value\",\"type\":\"uint256\"}],\"name\":\"Transfer\",\"type\":\"event\"}]"
  }'

USDC and Uniswap ABI example

async function main(stream) {
  // Fetch ABIs from Key-Value Store — requires you to have ABIs uploaded there first
  const usdcAbi = await qnLib.qnGetSet('usdc_abi')
  const uniswapAbi = await qnLib.qnGetSet('uniswap_abi')

  const USDC_ADDRESS =
    '0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48'.toLowerCase()

  const data = stream.data ? stream.data : stream
  var result = decodeEVMReceipts(data[0].receipts, [usdcAbi, uniswapAbi])

  // Filter for USDC events only
  result = result.filter(
    receipt =>
      receipt.decodedLogs &&
      receipt.decodedLogs.length > 0 &&
      receipt.decodedLogs.some(
        log => log.address?.toLowerCase() === USDC_ADDRESS
      )
  )

  return { result }
}

Complex Event Decoding (NFT Marketplace)

function main(stream) {
  // OpenSea Seaport OrderFulfilled event
  // test on Ethereum, block 21292520
  const seaportAbi = `[{
    "anonymous": false,
    "inputs": [
      {"type": "bytes32", "name": "orderHash", "indexed": false},
      {"type": "address", "name": "offerer", "indexed": true},
      {"type": "address", "name": "zone", "indexed": true},
      {"type": "address", "name": "recipient", "indexed": false},
      {
        "components": [
          {"type": "uint8", "name": "itemType"},
          {"type": "address", "name": "token"},
          {"type": "uint256", "name": "identifier"},
          {"type": "uint256", "name": "amount"}
        ],
        "type": "tuple[]",
        "name": "offer",
        "indexed": false
      },
      {
        "components": [
          {"type": "uint8", "name": "itemType"},
          {"type": "address", "name": "token"},
          {"type": "uint256", "name": "identifier"},
          {"type": "uint256", "name": "amount"},
          {"type": "address", "name": "recipient"}
        ],
        "type": "tuple[]",
        "name": "consideration",
        "indexed": false
      }
    ],
    "name": "OrderFulfilled",
    "type": "event"
  }]`

  const SEAPORT_ADDRESS =
    '0x00000000006c3852cbEf3e08E8dF289169EdE581'.toLowerCase()

  const data = stream.data
  var result = decodeEVMReceipts(data[0].receipts, [seaportAbi])

  result = result.filter(
    receipt =>
      receipt.decodedLogs &&
      receipt.decodedLogs.length > 0 &&
      receipt.decodedLogs.some(
        log =>
          log.address?.toLowerCase() === SEAPORT_ADDRESS &&
          log.name === 'OrderFulfilled'
      )
  )

  return { result }
}

Multiple Contract Events

async function main(stream) {
  // Load multiple ABIs for different protocols
  const aaveAbi = await qnLib.qnGetSet('aave_v3_pool_abi')
  const uniswapAbi = await qnLib.qnGetSet('uniswap_v3_pool_abi')
  const curveAbi = await qnLib.qnGetSet('curve_pool_abi')

  const AAVE_ADDRESS =
    '0x87870Bca3F3fD6335C3F4ce8392D69350B4fA4E2'.toLowerCase()
  const UNISWAP_POOL =
    '0x8ad599c3A0ff1De082011EFDDc58f1908eb6e6D8'.toLowerCase()

  const data = stream.data
  var result = decodeEVMReceipts(data[0].receipts, [
    aaveAbi,
    uniswapAbi,
    curveAbi,
  ])

  // Get all DeFi protocol events
  result = result.filter(
    receipt =>
      receipt.decodedLogs &&
      receipt.decodedLogs.length > 0 &&
      receipt.decodedLogs.some(log =>
        [AAVE_ADDRESS, UNISWAP_POOL].includes(log.address?.toLowerCase())
      )
  )

  return { result }
}

Working with Decoded Data

The decoded data will include:

  • Transaction metadata (hash, block number, etc.)
  • Decoded events in decodedLogs array
  • Event parameters with their proper types (address, uint256, etc.)

For example, a decoded ERC20 transfer event would look like:

{
  "address": "0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48",
  "name": "Transfer",
  "from": "0x88e6a0c2ddd26feeb64f039a2c41296fcb3f5640",
  "to": "0x6f1cdbbb4d53d226cf4b917bf768b94acbab6168",
  "value": "138908045566"
}
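Assuming the decoded shape above, your filter can work with these named parameters directly. As a sketch, the helper below (a hypothetical name, not part of the Streams API) totals Transfer values per recipient:

```javascript
// Sketch: aggregate decoded Transfer events by recipient.
// Assumes decodedLogs entries shaped like the example above
// ({ name, from, to, value }), with value as a decimal string.
function totalsByRecipient(decodedLogs) {
  const totals = {}
  for (const log of decodedLogs) {
    if (log.name !== 'Transfer') continue
    const to = log.to.toLowerCase()
    // Use BigInt: uint256 values can exceed Number's safe integer range.
    totals[to] = (totals[to] ?? 0n) + BigInt(log.value)
  }
  return totals
}
```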

Common Use Cases

  • Monitoring specific contract events
  • Filtering transactions by event type
  • Extracting parameter values from events
  • Tracking token transfers and approvals
  • Decoding complex marketplace orders
  • Monitoring DeFi protocol activities

Managing ABIs with Key-Value Store

You can store and manage your contract ABIs using Key-Value Store:

  1. Upload ABIs using REST APIs
  2. Update ABIs when contracts are upgraded
  3. Load multiple ABIs dynamically in your filter
  4. Share ABIs across different streams

Note

The filter function must be named main and return an object containing the modified data.
