Azure Blob Storage Destination

Updated on
Apr 10, 2026

Overview

The Azure Blob Storage destination lets you store your QuickNode Streams data directly in Microsoft Azure's cloud storage. This destination is ideal for data warehousing, analytics, and long-term storage of blockchain data.

Prerequisites

Before setting up Azure as a destination, ensure you have:


  • A Microsoft Azure account
  • An Azure Storage Account
  • A Storage Container created
  • A Shared Access Signature (SAS) token with appropriate permissions

Azure Configuration

Required Azure Resources


  1. Storage Account: Your Azure storage account where blobs will be stored
  2. Container: A blob container within your storage account
  3. SAS Token: A Shared Access Signature token with read/write permissions

Creating Azure Resources

1. Create Storage Account


  1. Log into the Azure Portal
  2. Click Create a resource
  3. Search for Storage account and select it
  4. Fill in the required details:
    • Subscription: Select your subscription
    • Resource group: Create or select existing
    • Storage account name: Choose a unique name
    • Location: Select your preferred region
    • Performance: Standard (recommended)
    • Redundancy: LRS (Locally-redundant storage) or your preferred option
  5. Click Review + create and then Create

2. Create Container


  1. Navigate to your storage account
  2. In the left menu, click Containers
  3. Click + Container
  4. Enter a container name (e.g., blockchain-data)
  5. Set Public access level to Private
  6. Click Create

3. Generate SAS Token


  1. In your storage account, click Shared access signature
  2. Configure the following settings:
    • Allowed services: Blob
    • Allowed resource types: Container, Object
    • Allowed permissions: Read, Write, Create, Delete
    • Start time: Current time
    • Expiry time: Set appropriate expiration (consider security implications)
  3. Click Generate SAS and connection string
  4. Copy the SAS token (the part after the ? in the connection string)
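Once you have copied the token, it can be worth sanity-checking it before pasting it into the Streams dashboard. The sketch below parses a token's query parameters with only the Python standard library; the token shown is a made-up placeholder, not a real signature:

```python
from urllib.parse import parse_qs

def inspect_sas_token(sas_token: str) -> dict:
    """Parse a SAS token's query parameters into a flat dict."""
    # Tokens are sometimes copied with a leading '?'; strip it first.
    params = parse_qs(sas_token.lstrip("?"))
    return {k: v[0] for k, v in params.items()}

# Placeholder token for illustration only -- not a real signature.
token = "sv=2022-11-02&ss=b&srt=co&sp=rwdc&se=2026-01-01T00:00:00Z&spr=https&sig=FAKE"
info = inspect_sas_token(token)

# The destination needs read, write, create, and delete permissions (sp=...).
assert all(p in info["sp"] for p in "rwcd"), "SAS token is missing permissions"
print(info["se"])  # expiry time -- make sure it is far enough in the future
```

Checking the `sp` (permissions) and `se` (expiry) fields up front avoids the most common failure mode: a stream that pauses later because its token expired or lacked delete rights.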

Stream Configuration

Destination Attributes

When configuring your stream with Azure destination, you'll need to provide the following attributes:

| Attribute | Type | Required | Description |
| --- | --- | --- | --- |
| storage_account | string | Yes | Your Azure storage account name |
| container | string | Yes | The container name where blobs will be stored |
| sas_token | string | Yes | Your Shared Access Signature token |
| blob_prefix | string | No | Prefix for blob names (e.g., prefix/) |
| file_compression_type | string | Yes | Compression type for files |
| file_type | string | Yes | File format (.json) |
| max_retry | number | Yes | Maximum number of retry attempts |
| retry_interval_sec | number | Yes | Interval between retries in seconds |
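As a quick guard against typos, the required attributes can be checked client-side before you create the stream. This is a hypothetical pre-flight helper that mirrors the table above; it is not part of any QuickNode SDK:

```python
# Hypothetical pre-flight check mirroring the attribute table; blob_prefix
# is optional and therefore not listed among the required fields.
REQUIRED = {
    "storage_account": str,
    "container": str,
    "sas_token": str,
    "file_compression_type": str,
    "file_type": str,
    "max_retry": int,
    "retry_interval_sec": int,
}

def validate_attributes(attrs: dict) -> list[str]:
    """Return a list of problems; an empty list means the attributes look sane."""
    problems = []
    for name, expected in REQUIRED.items():
        if name not in attrs:
            problems.append(f"missing required attribute: {name}")
        elif not isinstance(attrs[name], expected):
            problems.append(f"{name} should be of type {expected.__name__}")
    return problems

print(validate_attributes({"storage_account": "myacct"}))
```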

File Configuration Options

File Types


  • .json: Human-readable JSON format, good for development and debugging

Compression Types

Choose the compression type that best fits your use case:


  • None: No compression, fastest processing
  • Gzip: Good compression ratio, widely supported
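With Gzip selected, each delivered blob is a gzip-compressed JSON file. The sketch below shows the round trip with the standard library; the payload shape is illustrative, not the exact Streams schema for your dataset:

```python
import gzip
import json

# Illustrative payload; real Streams payloads follow your dataset's schema.
payload = [{"blockNumber": 19000000, "txCount": 150}]

# Compress the way a .json file with gzip compression would be stored...
compressed = gzip.compress(json.dumps(payload).encode("utf-8"))

# ...and read it back, as you would after downloading a blob.
records = json.loads(gzip.decompress(compressed).decode("utf-8"))
print(records[0]["blockNumber"])
```

Gzip typically shrinks repetitive JSON considerably, at the cost of a decompression step before the data can be queried.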

Setup Guide

Step 1: Create Your Stream

On your QuickNode dashboard, navigate to the Streams page by clicking the Streams tab in the left side panel, then click the Create Stream button in the top-right corner. You'll be prompted to configure your Stream settings:

  1. Basic Configuration:

    • Stream Name: Choose a descriptive name
    • Network: Select your target blockchain network
  2. Dataset Configuration:

    • Dataset: Choose the data type you want to stream
    • Batch messages: Configure how many records to process per batch
    • Modify the stream payload: (Optional) Write JavaScript code to filter/transform your data before delivery
  3. Destination Configuration:

    • Destination Type: Select Azure
    • Storage account: Enter your Azure storage account name
    • File compression: Select compression type
    • SAS token: Paste your SAS token
    • Container: Enter your container name
    • Blob prefix: Set your desired prefix (e.g., prefix/)
    • Retry wait period: Set retry delay in seconds (default: 1)
    • Pause stream after: Set retry attempts (default: 3)

Step 2: Activate Your Stream

Review your configuration and click Create Stream. Your stream will begin processing data and storing it in Azure Blob Storage.

Example Configuration

JSON Configuration Example

{
  "name": "Ethereum Block Stream",
  "network": "ethereum-mainnet",
  "dataset": "block",
  "destination": "azure",
  "destination_attributes": {
    "storage_account": "myblockchainstorage",
    "container": "ethereum-data",
    "sas_token": "sv=2020-08-04&ss=b&srt=sco&sp=rwdlacupitx&se=2024-01-01T00:00:00Z&st=2023-01-01T00:00:00Z&spr=https&sig=...",
    "blob_prefix": "ethereum/mainnet/blocks/",
    "file_compression_type": "gzip",
    "file_type": ".json",
    "max_retry": 3,
    "retry_interval_sec": 2
  },
  "dataset_batch_size": 100,
  "elastic_batch_enabled": true,
  "status": "active"
}
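Under the hood, Azure's Blob service authorizes SAS requests by appending the token to the blob URL. The sketch below only builds the URL a Put Blob request would target, using standard-library code and illustrative values; no request is sent:

```python
# Sketch of how a SAS-authorized Put Blob request is addressed; the values
# are illustrative and the request is only constructed here, not sent.
def put_blob_url(storage_account: str, container: str,
                 blob_name: str, sas_token: str) -> str:
    """Build the URL that Azure's Put Blob operation targets with SAS auth."""
    base = f"https://{storage_account}.blob.core.windows.net/{container}/{blob_name}"
    return f"{base}?{sas_token.lstrip('?')}"

url = put_blob_url(
    "myblockchainstorage",
    "ethereum-data",
    "ethereum/mainnet/blocks/19000000.json.gz",
    "sv=2022-11-02&sp=rwcd&sig=FAKE",
)
# A real Put Blob request would also send the x-ms-blob-type header,
# e.g. "BlockBlob".
print(url)
```

Note how the blob_prefix from the configuration simply becomes the leading path segments of each blob name, so you can partition data (e.g., by network and dataset) for downstream analytics tools.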