
Getting Started with Streams

Updated on Jun 26, 2024


Streams is a data streaming solution designed for web3 applications. It offers a variety of features, including the retrieval of real-time and historical blockchain data to your preferred destination. The primary goal of Streams is to connect web3 transactional data with a native and highly efficient streaming interface.

Why Streams?

Retrieving blockchain data over JSON-RPC typically requires continuous polling, along with error handling, retries, and logic for chain reorganizations. While this approach is standard, it becomes unmanageable for transactional web3 data, leading to inefficiencies and wasted resources. Streams offers an alternative by adopting a push model for data delivery. This event-driven approach ensures real-time, efficient, and consistent data transmission, addressing the drawbacks of traditional polling.
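The overhead that the push model eliminates can be sketched as a naive polling loop. This is an illustrative example, not QuickNode code; the `get_latest_block` callable stands in for a JSON-RPC call such as `eth_blockNumber`:

```python
import time

def poll_for_new_blocks(get_latest_block, last_seen, max_retries=3):
    """Naive JSON-RPC-style polling: the caller must track its own
    cursor, retry on failure, and separately handle reorgs."""
    for attempt in range(max_retries):
        try:
            latest = get_latest_block()  # e.g. an eth_blockNumber call
            if latest > last_seen:
                # The caller must now fetch every block in
                # (last_seen, latest] with further RPC calls.
                return latest
            return last_seen  # nothing new: this poll was wasted work
        except ConnectionError:
            time.sleep(2 ** attempt)  # manual backoff on every hiccup
    return last_seen
```

With a push model, this loop, its retry logic, and the wasted empty polls disappear: each data range arrives once, as an event.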

Streams changes how web3 data is handled by pushing it to users, enabling seamless integration with data lakes and warehouses. It guarantees exactly-once data delivery, in the order of the dataset's on-chain finality. This delivery guarantee sets a foundation for reliable, accurate, and timely blockchain data.

Streams Features

  • Streamlined Integration - With its user-friendly interface, Streams simplifies the integration and management of blockchain data and makes it accessible with just a few clicks.
  • Reliable Data Delivery - Streams guarantees that every block, receipt, or trace is delivered exactly once, in the order of dataset finality, preventing issues like corrupt or missing data and integrating seamlessly with your data lake.
  • Real-Time Data Consistency - Streams ensures the delivery of consistent, real-time data for reliable live dashboards.
  • Efficient Historical Data Handling - With Streams, you can efficiently configure large data batches, set precise date ranges, and fine-tune destinations for streamlined historical data management.
  • Transparent User Experience - Streams provides transparency through operational logs and performance metrics. The Streams dashboard offers clear billing and usage tracking for effective financial management, empowering users to allocate resources wisely and optimize their investment in web3 data infrastructure.
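The exactly-once, in-order guarantee above is what lets a consumer stay simple. As an illustrative sketch (not QuickNode code), a consumer receiving block-level messages can assume each block number is exactly one past the previous, with no dedup or gap-filling logic of its own:

```python
class OrderedBlockConsumer:
    """Illustrative consumer relying on exactly-once, in-order delivery:
    each incoming block number must be exactly next_expected."""

    def __init__(self, start_block):
        self.next_expected = start_block
        self.processed = []

    def on_message(self, block_number, payload):
        # Under exactly-once, in-order delivery this check never fires;
        # it documents the invariant the consumer depends on.
        if block_number != self.next_expected:
            raise ValueError(f"gap or duplicate: got {block_number}, "
                             f"expected {self.next_expected}")
        self.processed.append(payload)
        self.next_expected += 1
```

With polling-based ingestion, this class would instead need duplicate detection, gap backfill, and reorg rollback.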

Streams Lifecycle

Streams are data pipelines that fetch and deliver blockchain data to your desired destination. The data can range from transaction logs and block data to other relevant datasets. You can start and stop streams at any time; when a stream is active, it resumes from the last delivered data range. A stream's lifecycle consists of the following stages:

  • Active: Streams fetches and sends data as per the defined configurations.
  • Paused: At any time, you may halt data retrieval and delivery temporarily.
  • Terminated: Data streaming has been terminated due to exhausted delivery attempts or other errors.
  • Completed: The stream ends successfully after covering the defined range.
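The four stages above can be modeled as a small state machine. The allowed transitions here are an assumption inferred from the descriptions (pause/resume is reversible; Terminated and Completed are final), not an official state chart:

```python
from enum import Enum

class StreamState(Enum):
    ACTIVE = "active"
    PAUSED = "paused"
    TERMINATED = "terminated"
    COMPLETED = "completed"

# Assumed transitions: a stream can be paused and resumed; it ends either
# by completing its range or by exhausting delivery attempts (terminated).
TRANSITIONS = {
    StreamState.ACTIVE: {StreamState.PAUSED, StreamState.TERMINATED,
                         StreamState.COMPLETED},
    StreamState.PAUSED: {StreamState.ACTIVE, StreamState.TERMINATED},
    StreamState.TERMINATED: set(),  # terminal
    StreamState.COMPLETED: set(),   # terminal
}

def can_transition(src: StreamState, dst: StreamState) -> bool:
    return dst in TRANSITIONS[src]
```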

Feature Availability

Streams Beta is available starting from the Build plan. For teams with unique requirements, we offer tailored datasets, dedicated support, and custom integrations. Contact our team for more information.

| QuickNode plan | Build | Scale | Enterprise |
| --- | --- | --- | --- |
| Price per GB | $2.50 | $1.75 | Custom, tiered pricing* |
| Supported Destinations | Webhooks, Functions | + S3-compatible storage, Snowflake, PostgreSQL | + Kafka and direct database integrations: SQL, MongoDB, Redis, Snowflake, Clickhouse* |
| Reorg handling | Latest block delay | + Real-time reorg handling | + Real-time reorg handling |
| Datasets | Blocks, Transactions, Logs, Receipts, Traces | Same as Build plan | Custom datasets |


Access Streams through the QuickNode Developer Portal and via the Streams REST API.
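As a hedged sketch of programmatic access, the helper below assembles a create-stream request. The endpoint path, header name, and field names here are illustrative placeholders, NOT the documented Streams REST API schema; consult the Streams REST API reference for the actual shapes:

```python
import json

def build_create_stream_request(api_key, dataset, network, start_range,
                                end_range, destination_url):
    """Assemble a hypothetical create-stream request. Every field name
    below is an assumption for illustration, not the real API schema."""
    return {
        # Hypothetical endpoint path, not the documented one:
        "url": "https://api.quicknode.com/streams/rest/v1/streams",
        "headers": {
            "x-api-key": api_key,  # assumed auth header
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "dataset": dataset,          # e.g. "block"
            "network": network,          # e.g. "ethereum-mainnet"
            "start_range": start_range,
            "end_range": end_range,
            "destination": {"type": "webhook", "url": destination_url},
        }),
    }
```

The request can then be sent with any HTTP client; keeping request assembly separate from sending makes the payload easy to test and log.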

Supported Chains

Streams can be quickly provisioned for any of the supported chains and networks mentioned below. For teams with unique requirements, contact our team for more information. Otherwise, our self-serve experience has you covered on the following chains:

  • Arbitrum Nova
  • Avalanche C-Chain (testnet: Fuji)
  • Bera (coming soon; testnets: Artio, bArtio)
  • BNB Smart Chain (testnet available)
  • Ethereum (testnets: Holesky, Sepolia)
  • Immutable zkEVM (testnet available)
  • Polygon zkEVM

Data Fetch Limits

Streams data fetch limits are tailored to accommodate the diverse needs of our users across various blockchain networks. These limits are designed to ensure optimal performance and equitable access to datasets. Below, you'll find the specific data fetch limits for each chain and network, segmented by QuickNode plans.

Understanding the Limits

  • Messages per Second (mps): This metric represents the maximum number of dataset messages you can fetch per second, according to your subscription plan.
  • Chain-Specific Overrides: Some chains may have limits that differ from the standard rates due to their unique characteristics or infrastructure requirements. We are committed to continually increasing the mps rates for every chain supported on Streams.

While Streams allows you to create an unlimited number of streams to meet your diverse data needs, the total data fetching speed is capped per chain, based on your subscription plan. This means the combined throughput, measured in messages per second (mps), of all your streams targeting a single blockchain network cannot exceed your plan's rate limit.

  • Build Plan: up to 125 mps across all streams on a single chain.
  • Scale Plan: up to 500 mps for all combined streams on one chain.
  • Enterprise Plan: starting at 3,000 mps, with custom limits of up to 100,000 mps, shared across all streams on a single chain.
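Since the cap applies to the combined throughput of all streams on one chain, a safe per-stream budget can be derived by dividing the plan's cap by the number of streams. The even split below is a simplification for illustration; in practice the cap is shared dynamically, not partitioned:

```python
# Per-chain mps caps from the plan tiers above (Enterprise base rate).
PLAN_CHAIN_MPS_CAP = {"build": 125, "scale": 500, "enterprise": 3000}

def per_stream_budget(plan: str, streams_on_chain: int) -> float:
    """Conservative per-stream mps budget: split the plan's per-chain
    cap evenly across the streams targeting that chain."""
    cap = PLAN_CHAIN_MPS_CAP[plan]
    return cap / max(streams_on_chain, 1)
```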

See detailed Fetch Limits Table below:

Fetch Limits Table

| Chain/Network | Plan | Messages per Second (mps) |
| --- | --- | --- |
| General Limit | Build | 125 |
| General Limit | Scale | 500 |
| General Limit | Enterprise | 3,000 |

Please keep in mind that while Streams is in Beta these limits are subject to change.

Tips for Managing Limits and Rates

  • Destination Tuning: The efficiency of your data stream is significantly impacted by how well-tuned your destination is. Ensure that your data lakes, warehouses, or other destinations are optimized to handle the incoming data streams. This may involve adjusting settings for data ingestion, storage, and processing capacities to match the throughput capabilities of your Streams plan.
  • Geographical Proximity: The rate at which you can fetch data may also depend on how close you are to the region where Streams is deployed. Data retrieval times can be optimized by selecting a deployment region that is geographically closer to you or your destination.
  • Monitor Performance: Regularly check your Streams dashboard to monitor your usage against your limits. Adjust your streams as needed to ensure continuous, uninterrupted access to data.
  • Upgrade for More Capacity: If you find your data needs increasing beyond what your current plan allows, consider upgrading to a higher plan. This will not only increase your fetch limits but also potentially offer additional features and capabilities to support your growing requirements.

Have an Idea?

Let us know which new destinations, features, metrics, or datasets you want us to support. You can upvote or create a new idea on our Streams Roadmap.
