
Yellowstone and Carbon: Parse Real-time Solana Program Data

Updated on May 01, 2025


Overview

Monitoring transactions for specific Solana programs can provide valuable insights for development, analytics, or operational purposes. Accessing program data in real time can be challenging, and making sense of encoded program data can be even more difficult. In this guide, we'll build a transaction monitor for Pump.fun using Yellowstone gRPC and Carbon, a lightweight indexing framework, to parse the data.

What You Will Do

By the end of this guide, you'll have a Rust application that:

  • Utilizes Yellowstone gRPC and the Carbon indexing framework
  • Connects to a Yellowstone gRPC endpoint
  • Filters and streams Pump.fun transactions to monitor new token mints and token AMM migrations in real time
  • Processes and logs key information about targeted events

What You Will Need

Dependency | Version
rustc | 1.85.0
cargo | 1.85.0
tokio | 1.43.0
yellowstone-grpc-proto | 5.0.0
log | 0.4.25
env_logger | 0.11.5
carbon-core | 0.8.0
carbon-yellowstone-grpc-datasource | 0.8.0
carbon-pumpfun-decoder | 0.8.0
async-trait | 0.1.86
dotenv | 0.15.0

Understanding the Tools

What is Yellowstone gRPC?

Yellowstone gRPC is a high-performance data streaming solution for Solana built on the Geyser plugin system. It offers:

  • Real-time streaming of account updates, transactions, and slot notifications
  • Lower latency compared to traditional WebSocket implementations
  • Strong type safety through gRPC
  • Efficient filtering capabilities for subscriptions

What is Pump.fun?

Pump.fun is a popular token creation platform on Solana that allows anyone to create meme tokens quickly and easily. Monitoring Pump.fun transactions can provide insights into new token launches, market trends, and user activity.

What is Carbon?

Carbon is a lightweight indexing framework for Solana that simplifies the process of capturing and processing on-chain data. It provides a modular pipeline architecture with these key components:

  • Pipeline: The core orchestrator that manages data flow
  • Data sources: Components that provide data updates (transactions, accounts)
  • Pipes: Process specific updates through decoders and processors
  • Decoders: Convert raw blockchain data into structured types
  • Processors: Implement custom logic for the decoded data

Carbon handles the complex details of streaming, decoding, and processing blockchain data, allowing you to focus on building your application logic. At the time of this writing, Carbon offers the following Program Decoders:

Available Program Decoders

Crate Name | Description
carbon-associated-token-account-decoder | Associated Token Account Decoder
carbon-drift-v2-decoder | Drift V2 Program Decoder
carbon-fluxbeam-decoder | Fluxbeam Program Decoder
carbon-jupiter-dca-decoder | Jupiter DCA Program Decoder
carbon-jupiter-limit-order-decoder | Jupiter Limit Order Program Decoder
carbon-jupiter-limit-order-2-decoder | Jupiter Limit Order 2 Program Decoder
carbon-jupiter-perpetuals-decoder | Jupiter Perpetuals Program Decoder
carbon-jupiter-swap-decoder | Jupiter Swap Program Decoder
carbon-kamino-farms-decoder | Kamino Farms Program Decoder
carbon-kamino-lending-decoder | Kamino Lend Decoder
carbon-kamino-limit-order-decoder | Kamino Limit Order Program Decoder
carbon-kamino-vault-decoder | Kamino Vault Decoder
carbon-lifinity-amm-v2-decoder | Lifinity AMM V2 Program Decoder
carbon-marginfi-v2-decoder | Marginfi V2 Program Decoder
carbon-marinade-finance-decoder | Marinade Finance Program Decoder
carbon-memo-program-decoder | SPL Memo Program Decoder
carbon-meteora-dlmm-decoder | Meteora DLMM Program Decoder
carbon-meteora-pools-decoder | Meteora Pools Program Decoder
carbon-moonshot-decoder | Moonshot Program Decoder
carbon-mpl-core-decoder | MPL Core Program Decoder
carbon-mpl-token-metadata-decoder | MPL Token Metadata Program Decoder
carbon-name-service-decoder | SPL Name Service Program Decoder
carbon-okx-dex-decoder | OKX DEX Decoder
carbon-openbook-v2-decoder | Openbook V2 Program Decoder
carbon-orca-whirlpool-decoder | Orca Whirlpool Program Decoder
carbon-phoenix-v1-decoder | Phoenix V1 Program Decoder
carbon-pumpfun-decoder | Pumpfun Program Decoder
carbon-pump-swap-decoder | PumpSwap Program Decoder
carbon-raydium-amm-v4-decoder | Raydium AMM V4 Program Decoder
carbon-raydium-clmm-decoder | Raydium CLMM Program Decoder
carbon-raydium-cpmm-decoder | Raydium CPMM Program Decoder
carbon-raydium-launchpad-decoder | Raydium Launchpad Program Decoder
carbon-raydium-liquidity-locking-decoder | Raydium Liquidity Locking Program Decoder
carbon-sharky-decoder | SharkyFi Decoder
carbon-solayer-pool-restaking-decoder | Solayer Pool Restaking Program Decoder
carbon-stabble-stable-swap-decoder | Stabble Stable Swap Decoder
carbon-stabble-weighted-swap-decoder | Stabble Weighted Swap Decoder
carbon-stake-program-decoder | Stake Program Decoder
carbon-system-program-decoder | System Program Decoder
carbon-token-2022-decoder | Token 2022 Program Decoder
carbon-token-program-decoder | Token Program Decoder
carbon-virtual-curve-decoder | Meteora Virtual Curve Program Decoder
carbon-virtuals-decoder | Virtuals Program Decoder
carbon-zeta-decoder | Zeta Program Decoder

Check the Carbon documentation for the latest updates and additional decoders. For this guide, we will use carbon-pumpfun-decoder to decode the Pump.fun program instructions. After completing this guide, you should be able to utilize any of the other decoders in a similar manner.
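Because each decoder is its own crate, switching programs is typically just a dependency change. For example, a Cargo.toml sketch for targeting Raydium AMM V4 instead of Pump.fun might look like this (the version numbers below mirror the ones used in this guide; check crates.io for versions that match your carbon-core):

```toml
[dependencies]
carbon-core = "0.8.0"
# Swap the Pump.fun decoder for any other supported program decoder, e.g.:
# carbon-pumpfun-decoder = "0.8.0"
carbon-raydium-amm-v4-decoder = "0.8.0"
```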

Project Setup

Let's start by setting up our Rust project structure.

1. Create a New Project

cargo new pump-fun-carbon
cd pump-fun-carbon

2. Configure Dependencies

Replace the contents of your Cargo.toml file with:

[package]
name = "pump-fun-carbon"
version = "0.1.0"
edition = "2024"

[dependencies]
# Yellowstone & Carbon dependencies
carbon-core = "0.8.0"
carbon-pumpfun-decoder = "0.8.0"
carbon-yellowstone-grpc-datasource = "0.8.0"
yellowstone-grpc-proto = "5.0.0"

# Async and utilities
async-trait = "0.1.86"
tokio = { version = "1.43.0", features = ["full"] }
dotenv = "0.15.0"
env_logger = "0.11.5"
log = "0.4.25"

3. Create Environment Configuration

Create a .env file in your project root to store your Yellowstone gRPC endpoint details:

GEYSER_URL=your-quicknode-yellowstone-grpc-endpoint
X_TOKEN=your-quicknode-x-token

Replace the placeholder values with your actual QuickNode Yellowstone gRPC endpoint and authentication token. You can find information on configuring your endpoint in our docs.

Implementation

Now, let's build our Pump.fun transaction monitor. When you initialized the project, you should have a src directory with a main.rs file. We'll be adding our code to this file. If you do not have one, go ahead and create it.

1. Create Constants and Imports

Let's start by importing the necessary libraries and defining some constants for our application:

use {
    async_trait::async_trait,
    carbon_core::{
        deserialize::ArrangeAccounts,
        error::CarbonResult,
        instruction::{DecodedInstruction, InstructionMetadata, NestedInstructions},
        metrics::MetricsCollection,
        processor::Processor,
    },
    carbon_pumpfun_decoder::{
        PROGRAM_ID as PUMP_FUN_PROGRAM_ID, PumpfunDecoder,
        instructions::{PumpfunInstruction, create::Create, migrate::Migrate},
    },
    carbon_yellowstone_grpc_datasource::YellowstoneGrpcGeyserClient,
    std::{
        collections::{HashMap, HashSet},
        env,
        sync::Arc,
    },
    tokio::sync::RwLock,
    yellowstone_grpc_proto::geyser::{
        CommitmentLevel, SubscribeRequestFilterAccounts, SubscribeRequestFilterTransactions,
    },
};

// Pump.fun authority addresses to monitor
const PUMP_FUN_MINT_AUTHORITY: &str = "TSLvdd1pWpHVjahSpsvCXUbgwsL3JAcvokwaKt1eokM";
const PUMP_FUN_MIGRATION_AUTHORITY: &str = "39azUYFWPz3VHgKCf3VChUwbpURdCHRxjWVowf5jUJjg";

In this code, we are importing the necessary modules from Carbon and the Pump.fun decoder. We also define constants for the Pump.fun mint and migration authorities, which we will use to filter transactions--this will help us reduce the amount of data we need to process and focus on the relevant transactions (more on this in a bit).
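A quick way to catch copy-paste mistakes in constants like these is a base58 sanity check. The helper below is illustrative only (it validates the character set and typical length of a Solana address, not that the account actually exists on-chain), and uses only the standard library:

```rust
// Pump.fun mint authority from the guide above.
const PUMP_FUN_MINT_AUTHORITY: &str = "TSLvdd1pWpHVjahSpsvCXUbgwsL3JAcvokwaKt1eokM";

// Solana addresses are base58-encoded 32-byte keys: 32-44 characters drawn
// from the base58 alphabet (which excludes 0, O, I, and l).
fn looks_like_base58_address(s: &str) -> bool {
    const ALPHABET: &str = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz";
    (32..=44).contains(&s.len()) && s.chars().all(|c| ALPHABET.contains(c))
}

fn main() {
    // A malformed constant would fail this check at startup.
    assert!(looks_like_base58_address(PUMP_FUN_MINT_AUTHORITY));
    println!("authority constant looks well-formed");
}
```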

2. Implement the Main Function

Next, let's implement the main function that serves as the entry point for our application. Add the following code to your main.rs file (we will walk through the code afterwards):

#[tokio::main]
pub async fn main() -> CarbonResult<()> {
    // 1 - Initialize logging
    unsafe {
        std::env::set_var("RUST_LOG", "info");
    }
    env_logger::init();
    log::info!("Starting Pumpfun Transaction Processor Using Carbon");

    // 2 - Check environment variables
    dotenv::dotenv().ok();
    let geyser_url = match env::var("GEYSER_URL") {
        Ok(url) if !url.is_empty() => url,
        _ => {
            log::error!("GEYSER_URL environment variable not set or empty");
            return Err(carbon_core::error::Error::Custom(
                "GEYSER_URL not set".into(),
            ));
        }
    };

    log::info!("Using GEYSER_URL: {}", geyser_url);

    let x_token = env::var("X_TOKEN").ok();
    log::info!(
        "X_TOKEN is {}",
        if x_token.is_some() { "set" } else { "not set" }
    );

    // 3 - Initialize account filters
    let account_filters: HashMap<String, SubscribeRequestFilterAccounts> = HashMap::new();

    // 4 - Initialize transaction filter
    let transaction_filter = SubscribeRequestFilterTransactions {
        vote: Some(false),
        failed: Some(false),
        account_include: vec![
            PUMP_FUN_MINT_AUTHORITY.to_string(),
            PUMP_FUN_MIGRATION_AUTHORITY.to_string(),
        ],
        account_exclude: vec![],
        account_required: vec![PUMP_FUN_PROGRAM_ID.to_string()],
        signature: None,
    };

    let mut transaction_filters: HashMap<String, SubscribeRequestFilterTransactions> =
        HashMap::new();

    transaction_filters.insert("pump_fun_transaction_filter".to_string(), transaction_filter);

    // 5 - Initialize Yellowstone Grpc Geyser Client
    let yellowstone_grpc = YellowstoneGrpcGeyserClient::new(
        geyser_url,
        x_token,
        Some(CommitmentLevel::Processed),
        account_filters,
        transaction_filters,
        Arc::new(RwLock::new(HashSet::new())),
    );

    // 6 - Build and run the Carbon pipeline
    carbon_core::pipeline::Pipeline::builder()
        .datasource(yellowstone_grpc)
        .instruction(PumpfunDecoder, PumpfunInstructionProcessor)
        //.account(PumpfunDecoder, PumpfunAccountProcessor)
        .shutdown_strategy(carbon_core::pipeline::ShutdownStrategy::Immediate)
        .build()?
        .run()
        .await?;

    Ok(())
}

Let's break down what this code does:

  1. Initialize Logging - We set the log level to "info" and initialize the logger to see informative messages in our console output.

  2. Check Environment Variables - We load variables from the .env file and validate that the required GEYSER_URL is set. We also check if an X_TOKEN is available for authentication.

  3. Initialize Account Filters - We create an empty HashMap for account filters. In this implementation, we're not filtering by specific accounts, but the structure is prepared for future expansion (for example, if you wanted to monitor changes to a specific LP account rather than transaction events).

  4. Initialize Transaction Filter - We create a filter to specifically target Pump.fun transactions:

    • Exclude vote transactions and failed transactions
    • Include transactions involving either the mint authority or migration authority. Without this filter, we would receive all transactions for the Pump.fun program, including buys and sells--which would result in a lot of noise unrelated to our monitoring goal and ultimately use more credits. Because we know that the mint authority is only used for creating new tokens and the migration authority is only used for migrating tokens, we are now ensuring our monitor only receives the transactions with the instructions we care about.
    • Require the Pump.fun program ID to be present
    • No specific accounts to exclude or signatures to match (although you could consider using this field to filter malicious creators or other unwanted accounts)
  5. Initialize Yellowstone gRPC Client - We create the client that will connect to the Yellowstone gRPC endpoint with our configured filters. We are using Processed commitment level to ensure we receive the most up-to-date data. We have commented out an account processor since we are not using it in this example, but you can add it back in if you want to monitor account changes as well.

  6. Build and Run Carbon Pipeline - Finally, we build the Carbon pipeline that:

    • Uses the Yellowstone gRPC client as a data source
    • Connects the Pump.fun decoder to our processor
    • Sets an immediate shutdown strategy
    • Builds and runs the pipeline

Great! Now we just need to define and implement the PumpfunInstructionProcessor that will handle the decoded Pump.fun instructions.

3. Implement the Instruction Processor

Below your main function, let's implement the PumpfunInstructionProcessor that will process the decoded Pump.fun instructions:

pub struct PumpfunInstructionProcessor;

#[async_trait]
impl Processor for PumpfunInstructionProcessor {
    type InputType = (
        InstructionMetadata,
        DecodedInstruction<PumpfunInstruction>,
        NestedInstructions,
    );

    async fn process(
        &mut self,
        (metadata, instruction, _nested_instructions): Self::InputType,
        _metrics: Arc<MetricsCollection>,
    ) -> CarbonResult<()> {
        let signature = metadata.transaction_metadata.signature;
        let accounts = instruction.accounts;

        match instruction.data {
            PumpfunInstruction::Sell(_params) => {
                log::info!("❌ - EXPECT TO NEVER SEE SELL INSTRUCTION");
                log::info!("TxId: {}", signature);
            }
            PumpfunInstruction::Create(params) => {
                log::info!("💊 NEW PUMPFUN TOKEN CREATED");
                log::info!(
                    "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
                );
                log::info!("TxId: {}", signature);
                log::info!("Token Details:");
                log::info!("  • Name: {}", params.name);
                log::info!("  • Symbol: {}", params.symbol);

                match Create::arrange_accounts(&accounts) {
                    Some(arranged_accounts) => {
                        log::info!("  • Mint: {}", arranged_accounts.mint);
                        log::info!(
                            "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
                        );
                    }
                    None => {
                        log::warn!(
                            "Failed to arrange accounts for Create instruction. Create instruction: signature: {signature}"
                        );
                    }
                }
            }
            PumpfunInstruction::Migrate(_params) => {
                log::info!("🔄 MIGRATE INSTRUCTION DETECTED");
                log::info!(
                    "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
                );
                log::info!("TxId: {}", signature);
                match Migrate::arrange_accounts(&accounts) {
                    Some(arranged_accounts) => {
                        log::info!("  • Mint: {}", arranged_accounts.mint);
                        log::info!(
                            "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
                        );
                    }
                    None => {
                        log::warn!(
                            "Failed to arrange accounts for Migrate instruction. Migrate instruction: signature: {signature}"
                        );
                    }
                }
            }
            _ => {
                // Ignore non-target instructions
            }
        }

        Ok(())
    }
}

Let's break down this instruction processor:

  1. Struct and Trait Implementation - We define a PumpfunInstructionProcessor struct that implements the Carbon Processor trait, which is responsible for handling decoded instructions. From the Carbon documentation, "The Processor trait provides a single asynchronous method, process, which is responsible for handling data of a specific type (InputType). This trait is designed to be implemented by types that need to process data within the pipeline, allowing for customized handling of different data types.":
#[async_trait]
pub trait Processor {
    type InputType;

    async fn process(
        &mut self,
        data: Self::InputType,
        metrics: Arc<MetricsCollection>,
    ) -> CarbonResult<()>;
}
  2. Input Type Definition - The InputType is a tuple containing three elements:

    • InstructionMetadata: Contains transaction context information
    • DecodedInstruction<PumpfunInstruction>: The decoded Pump.fun instruction
    • NestedInstructions: Any nested inner instructions, like CPI calls
  3. Process Method - This is where the actual processing logic happens:

    • We extract the transaction signature and account list
    • We match against different instruction types to handle each one appropriately
  4. Instruction Handlers:

    • Sell Instruction - This is included for demonstration purposes only, to help illustrate how our account_include filter works. Our transaction filters specify that target transactions must include the mint or migration authority accounts, neither of which is present in sell instructions. Therefore, this code path should never execute. The log message "EXPECT TO NEVER SEE SELL INSTRUCTION" reinforces this. You could remove this code entirely, or modify the filters to include sell instructions if you want to monitor them as well.

    • Create Instruction - When a new Pump.fun token is created:

      • We log the transaction and format it for readability
      • We extract token details from the params like name and symbol
      • We use arrange_accounts to get structured access to the accounts involved, so that we can log the mint address
    • Migrate Instruction - When a token migration occurs, we log similar information as with Create

    • Other Instructions - Any other Pump.fun instructions are ignored

  5. Account Arrangement - For both Create and Migrate instructions, we use the arrange_accounts helper method. This Carbon function:

    • Takes the raw list of accounts
    • Returns them in a structured object that makes it easy to access specific accounts by name
    • Returns None if the accounts don't match the expected pattern
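The arrange_accounts pattern can be sketched with plain Rust to show what it does conceptually: turn a positional account list into a named struct, returning None when the shape doesn't match. The field names below are hypothetical, not the actual layout from carbon-pumpfun-decoder:

```rust
// Hypothetical named view over a positional account list. The real
// decoder generates these structs for each instruction type.
#[derive(Debug)]
struct CreateAccounts {
    mint: String,
    bonding_curve: String,
}

// Map positions to names; `?` on Option short-circuits to None when the
// account list is too short to match the expected pattern.
fn arrange_accounts(accounts: &[String]) -> Option<CreateAccounts> {
    Some(CreateAccounts {
        mint: accounts.first()?.clone(),
        bonding_curve: accounts.get(1)?.clone(),
    })
}

fn main() {
    let accounts = vec!["MintPubkey111".to_string(), "CurvePubkey111".to_string()];
    let arranged = arrange_accounts(&accounts).expect("two accounts expected");
    println!("mint = {}", arranged.mint);

    // Too few accounts: the helper signals the mismatch with None.
    assert!(arrange_accounts(&accounts[..1]).is_none());
}
```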

This processor design demonstrates a key strength of Carbon - the ability to work with decoded instruction data in a strongly typed way, rather than dealing with raw bytes. The PumpfunDecoder (which is provided by the carbon-pumpfun-decoder crate) handles the complex task of deserializing raw instruction data, allowing our processor to focus on business logic. You can use your IDE's IntelliSense features to explore the PumpfunInstruction enum and see all the possible instructions that can be decoded.
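The Processor pattern itself can be illustrated in miniature with std-only Rust (synchronous here for brevity; Carbon's real trait is async and also receives a metrics handle). The point is that the associated InputType lets each processor declare exactly what shape of decoded data it handles:

```rust
// Miniature, std-only analogue of Carbon's Processor pattern.
trait Processor {
    type InputType;
    fn process(&mut self, data: Self::InputType) -> Result<(), String>;
}

// A toy processor that counts and logs new mints it sees.
struct MintLogger {
    seen: usize,
}

impl Processor for MintLogger {
    // e.g. (slot, mint address) for a decoded Create instruction
    type InputType = (u64, String);

    fn process(&mut self, (slot, mint): Self::InputType) -> Result<(), String> {
        self.seen += 1;
        println!("slot {slot}: new mint {mint}");
        Ok(())
    }
}

fn main() {
    let mut logger = MintLogger { seen: 0 };
    logger.process((123, "SomeMint111".to_string())).unwrap();
    assert_eq!(logger.seen, 1);
}
```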

Running the Application

To run your Pump.fun transaction monitor:

  1. Make sure your .env file is set up with valid Yellowstone gRPC credentials
  2. Build the application:
cargo build
  1. Run the application:
cargo run

You should see output like:

[INFO] Starting Pumpfun Transaction Processor Using Carbon
[INFO] Using GEYSER_URL: your-geyser-url
[INFO] X_TOKEN is set

When a new Pump.fun token is created, you'll see output like:

[INFO] 💊 NEW PUMPFUN TOKEN CREATED
[INFO] ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
[INFO] TxId: 5QrTR9...truncated...
[INFO] Token Details:
[INFO] • Name: Awesome Token
[INFO] • Symbol: AWSM
[INFO] • Mint: AWsM4rKh...truncated...
[INFO] ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Great job!

Expanding Your Monitor

Here are some ways you might expand this basic monitor:

  1. Database Integration: Store transaction data in a database for analytics
  2. Expand Filters: Add more filters to target additional instruction types
  3. Multiple Program Monitoring: Define and insert additional transaction filters to monitor multiple programs and utilize multiple decoders
  4. Automate Trading: Leverage the Metis Swap API with Pump.fun support to build and implement automated trading strategies
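As a starting point for the database-integration idea, one common shape is to buffer decoded events in memory and persist them in batches. The sketch below is illustrative (TokenEvent, the flush threshold, and the batch shape are all assumptions, not Carbon types) and uses only the standard library:

```rust
use std::collections::VecDeque;

// Hypothetical record for a decoded Create event.
#[derive(Debug, Clone, PartialEq)]
struct TokenEvent {
    signature: String,
    name: String,
    symbol: String,
}

// Buffer events and hand back a batch to persist once a threshold is hit,
// so the database sees periodic bulk writes instead of one write per event.
struct EventBuffer {
    events: VecDeque<TokenEvent>,
    flush_at: usize,
}

impl EventBuffer {
    fn new(flush_at: usize) -> Self {
        Self { events: VecDeque::new(), flush_at }
    }

    fn push(&mut self, event: TokenEvent) -> Option<Vec<TokenEvent>> {
        self.events.push_back(event);
        if self.events.len() >= self.flush_at {
            Some(self.events.drain(..).collect())
        } else {
            None
        }
    }
}

fn main() {
    let mut buf = EventBuffer::new(2);
    let ev = |sig: &str| TokenEvent {
        signature: sig.to_string(),
        name: "Awesome Token".to_string(),
        symbol: "AWSM".to_string(),
    };
    assert!(buf.push(ev("sig1")).is_none());
    let batch = buf.push(ev("sig2")).expect("threshold reached");
    println!("flushed {} events", batch.len());
}
```

In the monitor above, a buffer like this would live inside PumpfunInstructionProcessor, with the returned batch handed to whatever store you choose.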

Wrap Up

In this guide, we've built a Pump.fun transaction monitor using Carbon and Yellowstone gRPC. This approach provides high-performance, real-time monitoring with minimal code. The Carbon framework abstracts away much of the complexity of parsing Solana transaction and account data, allowing you to focus on your application logic.

If you have any questions or need help with your Solana development projects, join us in the QuickNode Discord or follow us on Twitter.

We ❤️ Feedback!

Let us know if you have any feedback or requests for new topics. We'd love to hear from you.

