Overview
Streams is a data streaming solution designed for web3 applications. It offers a variety of features, including the delivery of real-time and historical blockchain data to your preferred destination. The primary goal of Streams is to connect web3 transactional data with a native, highly efficient streaming interface.
Why Streams?
Using JSON-RPC to track chain state involves continuous polling, error handling, retries, and dealing with chain reorganizations. While this approach is standard, it becomes unmanageable for transactional web3 data, leading to inefficiency and wasted resources. Streams offers an alternative by adopting a push model for data delivery. This event-driven approach ensures real-time, efficient, and consistent data transmission, addressing the drawbacks of traditional polling.
Streams revolutionizes web3 data handling by pushing data to users, enabling seamless integration with data lakes and warehouses. It guarantees exactly-once delivery in the order of the dataset's on-chain finality, setting a foundation for reliable, accurate, and timely blockchain data.
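On the consumer side, the push model and exactly-once guarantee can be sketched with a minimal handler that deduplicates redelivered messages. The payload shape (`block_number`, `data`) and the idempotency check below are illustrative assumptions, not the actual Streams message schema.

```python
# Minimal sketch of a push-model consumer. The payload fields
# (block_number, data) are hypothetical, not the real Streams schema.
def make_handler():
    seen = set()      # block numbers already processed (idempotency)
    received = []     # applied payloads, in delivery order

    def handle(payload: dict) -> bool:
        """Process one pushed message; return True if newly applied."""
        block = payload["block_number"]
        if block in seen:          # duplicate delivery: safely ignore
            return False
        seen.add(block)
        received.append(payload)
        return True

    return handle, received
```

Because delivery is push-based, the consumer only reacts to incoming messages; the duplicate check models how a destination stays consistent even if a message is retried.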
Streams Features
- Streamlined Integration - With its user-friendly interface, Streams simplifies the integration and management of blockchain data and makes it accessible with just a few clicks.
- Reliable Data Delivery - Streams guarantees that every block, receipt, or trace is delivered exactly once, in the order of dataset finality, preventing issues like corrupt or missing data and integrating seamlessly with your data lake.
- Real-Time Data Consistency - Streams ensures the delivery of consistent, real-time data for reliable live dashboards.
- Efficient Historical Data Handling - With Streams, you can efficiently configure large data batches, set precise date ranges, and fine-tune destinations for streamlined historical data management.
- Transparent User Experience - Streams provides transparency through operational logs and performance metrics. The Streams dashboard offers clear billing and usage tracking for effective financial management, empowering you to allocate resources wisely and optimize your investment in web3 data infrastructure.
Streams Lifecycle
Streams are data pipelines that fetch and deliver blockchain data to your desired destination. The data can include transaction logs, block data, and other relevant datasets. You can start and stop streams at any time. When a stream is active, it consistently proceeds from the last delivered data range reference. A stream lifecycle consists of the following stages:
- Active: The stream fetches and sends data per the defined configuration.
- Paused: At any time, you may halt data retrieval and delivery temporarily.
- Terminated: Data streaming has been terminated due to exhausted delivery attempts or other errors.
- Completed: The stream ends successfully after covering the defined range.
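The stages above can be read as a small state machine. The transition set below is an illustrative interpretation of the lifecycle, not the exact rules Streams enforces internally.

```python
# Illustrative state machine for the stream lifecycle described above.
# The exact transitions Streams permits may differ; this is a sketch.
TRANSITIONS = {
    "active":     {"paused", "terminated", "completed"},
    "paused":     {"active", "terminated"},
    "terminated": set(),   # terminal: delivery attempts exhausted or errors
    "completed":  set(),   # terminal: the defined range is fully covered
}

def next_state(current: str, target: str) -> str:
    """Move a stream to a new state, rejecting invalid transitions."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"cannot move from {current} to {target}")
    return target
```

For example, a paused stream can be resumed, but a completed or terminated stream is final and must be recreated rather than restarted.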
Feature Availability
Streams is available to all users with a QuickNode plan. For teams with unique requirements, we offer tailored datasets, dedicated support, and custom integrations. Contact our team for more information.
Feature | Free / Discover | Starter / Discover+ | Build / Growth | Scale / Business / ENT |
---|---|---|---|---|
Included GB | 10 / 3 | 15 / 5 | 10 / 20 | 20 / 50 / Custom |
Data Tier 1 Cost | - / - | $10.00 | $9.75 | $9.50 / $9.50 / Custom |
Data Tier 1 GB Allotment | - / - | 250GB | 225GB | 200GB / 200GB / Custom |
Data Tier 2 Cost | - / - | $9.50 | $9.00 | $8.00 / $8.00 / Custom |
Filters | ✅ | ✅ | ✅ | ✅ |
Compression | ❌ | ❌ | ❌ | ✅ |
Active Streams Limit | 1 | 3 | 5 | Unlimited |
MPS | 30 | 30 | 150 | 1000 / 1000 / 3000 |
Reorg Handling | ❌ | ❌ | Latest block delay | + Real-time reorg handling |
Datasets | Blocks, Transactions, Logs, Receipts | Blocks, Transactions, Logs, Receipts | Blocks, Transactions, Logs, Receipts, Traces | Same as Build / Same as Build / Custom |
Supported Destinations | Webhooks, Functions | Webhooks, Functions | Webhooks, Functions | + S3-compatible storage, Snowflake, PostgreSQL / Same / + Kafka, Direct database integrations* |
* Direct database integrations include SQL databases, MongoDB, Redis, Snowflake, Clickhouse
Access
Access Streams through the QuickNode Developer Portal or via the Streams REST API.
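Programmatic access via the REST API typically means authenticating each request with an API key. The endpoint path and header name in this sketch are assumptions for illustration; consult the Streams REST API reference for the actual values.

```python
# Sketch of constructing a Streams REST API request with Python's
# standard library. The base URL, path, and "x-api-key" header name
# are assumed for illustration, not confirmed API details.
from urllib.request import Request

def build_list_streams_request(api_key: str) -> Request:
    url = "https://api.quicknode.com/streams/rest/v1/streams"  # assumed path
    return Request(url, headers={"x-api-key": api_key})        # assumed header
```

The returned `Request` can then be sent with `urllib.request.urlopen` (or any HTTP client) to list, create, pause, or terminate streams, depending on the endpoint used.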
Supported Chains
Streams can be quickly provisioned for any of the supported chains and networks listed below. For teams with unique requirements, contact our team for more information. Otherwise, our self-serve experience has you covered on the following chains:
Chain | Mainnet | Testnets |
---|---|---|
Arbitrum | ✅ | Sepolia |
Arbitrum Nova | ✅ | |
Avalanche C-Chain | ✅ | Fuji |
Base | ✅ | Sepolia |
Bera | Coming soon | Artio, bArtio |
Bitcoin | ✅ | |
Blast | ✅ | Sepolia |
BNB Smart Chain | ✅ | Testnet |
Camp | Coming soon | Sepolia |
Celo | ✅ | |
Cyber | ✅ | Sepolia |
Ethereum | ✅ | Holesky, Sepolia |
Fantom | ✅ | |
Fraxtal | ✅ | |
Gnosis | ✅ | |
Immutable zkEVM | ✅ | Testnet |
Kaia | ✅ | Testnet |
Mantle | ✅ | Sepolia |
Mode | ✅ | |
Morph | Coming soon | Holesky |
Omni | Coming soon | Omega |
Optimism | ✅ | Sepolia |
Polygon | ✅ | Amoy |
Polygon zkEVM | ✅ | |
Race | ✅ | Testnet |
Redstone | ✅ | |
Scroll | ✅ | Testnet |
Story | Coming soon | Testnet |
Tron | ✅ | |
Xai | ✅ | Sepolia |
zkSync | ✅ | Sepolia |
Zora | ✅ | |
Data Fetch Limits
Streams data fetch limits are tailored to accommodate the diverse needs of our users across various blockchain networks. These limits are designed to ensure optimal performance and equitable access to datasets. Below, you'll find the specific data fetch limits for each chain and network, segmented by QuickNode plans.
Understanding the Limits
- Messages per Second (MPS): This metric represents the maximum number of dataset messages you can fetch per second, according to your subscription plan.
- Chain-Specific Overrides: Some chains may have specific limits that differ from the standard rates due to their unique characteristics or the infrastructure requirements they entail. We are committed to continually striving to increase the MPS rates for every chain supported on Streams.
It's important to remember that the total data fetching speed is capped by a defined limit for each chain, based on your subscription plan. This means the combined throughput, measured in messages per second (MPS), of all your streams targeting a single blockchain network cannot exceed your plan's rate limit.
- Starter Plan: up to 50 mps across all streams on a single chain.
- Growth Plan: up to 125 mps across all streams on a single chain.
- Business Plan: up to 500 mps for all combined streams on one chain.
- Business+ & Enterprise Plans: from 3,000 mps up to 100,000 mps, with the total speed spread across all streams on a single chain.
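Because the cap applies to the combined throughput of all streams on one chain, a simple way to reason about it is to divide the plan's per-chain cap across your active streams. The helper below is an illustrative budgeting sketch (an even split, using the caps listed above), not a Streams API.

```python
# Illustrative helper: split a plan's per-chain MPS cap evenly across
# the active streams targeting that chain. Caps taken from the plan
# list above; Business+/Enterprise limits are custom and omitted here.
PLAN_CAPS_MPS = {"starter": 50, "growth": 125, "business": 500}

def per_stream_budget(plan: str, active_streams: int) -> int:
    """Return an even per-stream MPS allocation for the given plan."""
    if active_streams < 1:
        raise ValueError("need at least one active stream")
    return PLAN_CAPS_MPS[plan] // active_streams
```

For example, five streams on the Growth plan targeting the same chain would each have roughly 25 mps of headroom before hitting the 125 mps combined cap.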
Tips for Managing Limits and Rates
- Destination Tuning: The efficiency of your data stream is significantly impacted by how well-tuned your destination is. Ensure that your data lakes, warehouses, or other destinations are optimized to handle the incoming data streams. This may involve adjusting settings for data ingestion, storage, and processing capacities to match the throughput capabilities of your Streams plan.
- Geographical Proximity: The rate at which you can fetch data may also depend on how close you are to the region where Streams is deployed. Data retrieval times can be optimized by selecting a deployment region that is geographically closer to you or your destination.
- Monitor Performance: Regularly check your Streams dashboard to monitor your usage against your limits. Adjust your streams as needed to ensure continuous, uninterrupted access to data.
- Upgrade for More Capacity: If you find your data needs increasing beyond what your current plan allows, consider upgrading to a higher plan. This will not only increase your fetch limits but also potentially offer additional features and capabilities to support your growing requirements.
We ❤️ Feedback!
If you have any feedback or questions about this documentation, let us know. We'd love to hear from you!