Overview
Monitoring transactions for specific Solana programs can provide valuable insights for development, analytics, or operations. However, accessing program data in real time can be challenging, and making sense of encoded program data can be even more difficult. In this guide, we'll build a transaction monitor for Pump.fun using Yellowstone gRPC for streaming and Carbon, a lightweight indexing framework, for parsing the data.
What You Will Do
By the end of this guide, you'll have a Rust application that:
- Utilizes Yellowstone gRPC and the Carbon indexing framework
- Connects to a Yellowstone gRPC endpoint
- Filters and streams Pump.fun transactions to monitor new token mints and AMM migrations in real time
- Processes and logs key information about targeted events
What You Will Need
- Rust and Cargo installed on your system
- A QuickNode account with the Yellowstone gRPC add-on enabled
- Basic understanding of Solana and Rust programming
- Familiarity with command-line interfaces
Dependency | Version |
---|---|
rustc | 1.85.0 |
cargo | 1.85.0 |
tokio | 1.43.0 |
yellowstone-grpc-proto | 5.0.0 |
log | 0.4.25 |
env_logger | 0.11.5 |
carbon-core | 0.8.0 |
carbon-yellowstone-grpc-datasource | 0.8.0 |
carbon-pumpfun-decoder | 0.8.0 |
async-trait | 0.1.86 |
dotenv | 0.15.0 |
Understanding the Tools
What is Yellowstone gRPC?
Yellowstone gRPC is a high-performance data streaming solution for Solana built on the Geyser plugin system. It offers:
- Real-time streaming of account updates, transactions, and slot notifications
- Lower latency compared to traditional WebSocket implementations
- Strong type safety through gRPC
- Efficient filtering capabilities for subscriptions
What is Pump.fun?
Pump.fun is a popular token creation platform on Solana that allows anyone to create meme tokens quickly and easily. Monitoring Pump.fun transactions can provide insights into new token launches, market trends, and user activity.
What is Carbon?
Carbon is a lightweight indexing framework for Solana that simplifies the process of capturing and processing on-chain data. It provides a modular pipeline architecture with these key components:
- Pipeline: The core orchestrator that manages data flow
- Data sources: Components that provide data updates (transactions, accounts)
- Pipes: Process specific updates through decoders and processors
- Decoders: Convert raw blockchain data into structured types
- Processors: Implement custom logic for the decoded data
Carbon handles the complex details of streaming, decoding, and processing blockchain data, allowing you to focus on building your application logic. At the time of this writing, Carbon offers the following Program Decoders:
Available Program Decoders
Crate Name | Description |
---|---|
carbon-associated-token-account-decoder | Associated Token Account Decoder |
carbon-drift-v2-decoder | Drift V2 Program Decoder |
carbon-fluxbeam-decoder | Fluxbeam Program Decoder |
carbon-jupiter-dca-decoder | Jupiter DCA Program Decoder |
carbon-jupiter-limit-order-decoder | Jupiter Limit Order Program Decoder |
carbon-jupiter-limit-order-2-decoder | Jupiter Limit Order 2 Program Decoder |
carbon-jupiter-perpetuals-decoder | Jupiter Perpetuals Program Decoder |
carbon-jupiter-swap-decoder | Jupiter Swap Program Decoder |
carbon-kamino-farms-decoder | Kamino Farms Program Decoder |
carbon-kamino-lending-decoder | Kamino Lend Decoder |
carbon-kamino-limit-order-decoder | Kamino Limit Order Program Decoder |
carbon-kamino-vault-decoder | Kamino Vault Decoder |
carbon-lifinity-amm-v2-decoder | Lifinity AMM V2 Program Decoder |
carbon-marginfi-v2-decoder | Marginfi V2 Program Decoder |
carbon-marinade-finance-decoder | Marinade Finance Program Decoder |
carbon-memo-program-decoder | SPL Memo Program Decoder |
carbon-meteora-dlmm-decoder | Meteora DLMM Program Decoder |
carbon-meteora-pools-decoder | Meteora Pools Program Decoder |
carbon-moonshot-decoder | Moonshot Program Decoder |
carbon-mpl-core-decoder | MPL Core Program Decoder |
carbon-mpl-token-metadata-decoder | MPL Token Metadata Program Decoder |
carbon-name-service-decoder | SPL Name Service Program Decoder |
carbon-okx-dex-decoder | OKX DEX Decoder |
carbon-openbook-v2-decoder | Openbook V2 Program Decoder |
carbon-orca-whirlpool-decoder | Orca Whirlpool Program Decoder |
carbon-phoenix-v1-decoder | Phoenix V1 Program Decoder |
carbon-pumpfun-decoder | Pumpfun Program Decoder |
carbon-pump-swap-decoder | PumpSwap Program Decoder |
carbon-raydium-amm-v4-decoder | Raydium AMM V4 Program Decoder |
carbon-raydium-clmm-decoder | Raydium CLMM Program Decoder |
carbon-raydium-cpmm-decoder | Raydium CPMM Program Decoder |
carbon-raydium-launchpad-decoder | Raydium Launchpad Program Decoder |
carbon-raydium-liquidity-locking-decoder | Raydium Liquidity Locking Program Decoder |
carbon-sharky-decoder | SharkyFi Decoder |
carbon-solayer-pool-restaking-decoder | Solayer Pool Restaking Program Decoder |
carbon-stabble-stable-swap-decoder | Stabble Stable Swap Decoder |
carbon-stabble-weighted-swap-decoder | Stabble Weighted Swap Decoder |
carbon-stake-program-decoder | Stake Program Decoder |
carbon-system-program-decoder | System Program Decoder |
carbon-token-2022-decoder | Token 2022 Program Decoder |
carbon-token-program-decoder | Token Program Decoder |
carbon-virtual-curve-decoder | Meteora Virtual Curve Program Decoder |
carbon-virtuals-decoder | Virtuals Program Decoder |
carbon-zeta-decoder | Zeta Program Decoder |
Check the Carbon documentation for the latest updates and additional decoders. For this guide, we will use carbon-pumpfun-decoder to decode the Pump.fun program instructions. After completing this guide, you should be able to utilize any of the other decoders in a similar manner.
Project Setup
Let's start by setting up our Rust project structure.
1. Create a New Project
cargo new pump-fun-carbon
cd pump-fun-carbon
2. Configure Dependencies
Replace the contents of your Cargo.toml file with:
[package]
name = "pump-fun-carbon"
version = "0.1.0"
edition = "2024"
[dependencies]
# Yellowstone & Carbon dependencies
carbon-core = "0.8.0"
carbon-pumpfun-decoder = "0.8.0"
carbon-yellowstone-grpc-datasource = "0.8.0"
yellowstone-grpc-proto = "5.0.0"
# Async and utilities
async-trait = "0.1.86"
tokio = { version = "1.43.0", features = ["full"] }
dotenv = "0.15.0"
env_logger = "0.11.5"
log = "0.4.25"
3. Create Environment Configuration
Create a .env file in your project root to store your Yellowstone gRPC endpoint details:
GEYSER_URL=your-quicknode-yellowstone-grpc-endpoint
X_TOKEN=your-quicknode-x-token
Replace the placeholder values with your actual QuickNode Yellowstone gRPC endpoint and authentication token. You can find information on configuring your endpoint in our docs.
Implementation
Now, let's build our Pump.fun transaction monitor. When you initialized the project, a src directory containing a main.rs file should have been created; we'll be adding our code to this file. If it does not exist, go ahead and create it.
1. Create Constants and Imports
Let's start by importing the necessary libraries and defining some constants for our application:
use {
async_trait::async_trait,
carbon_core::{
deserialize::ArrangeAccounts,
error::CarbonResult,
instruction::{DecodedInstruction, InstructionMetadata, NestedInstructions},
metrics::MetricsCollection,
processor::Processor,
},
carbon_pumpfun_decoder::{
PROGRAM_ID as PUMP_FUN_PROGRAM_ID, PumpfunDecoder,
instructions::{PumpfunInstruction, create::Create, migrate::Migrate},
},
carbon_yellowstone_grpc_datasource::YellowstoneGrpcGeyserClient,
std::{
collections::{HashMap, HashSet},
env,
sync::Arc,
},
tokio::sync::RwLock,
yellowstone_grpc_proto::geyser::{
CommitmentLevel, SubscribeRequestFilterAccounts, SubscribeRequestFilterTransactions,
},
};
// Pump.fun authority addresses to monitor
const PUMP_FUN_MINT_AUTHORITY: &str = "TSLvdd1pWpHVjahSpsvCXUbgwsL3JAcvokwaKt1eokM";
const PUMP_FUN_MIGRATION_AUTHORITY: &str = "39azUYFWPz3VHgKCf3VChUwbpURdCHRxjWVowf5jUJjg";
In this code, we import the necessary modules from Carbon and the Pump.fun decoder. We also define constants for the Pump.fun mint and migration authorities, which we will use to filter transactions. This helps us reduce the amount of data we need to process and focus on the relevant transactions (more on this in a bit).
2. Implement the Main Function
Next, let's implement the main function that serves as the entry point for our application. Add the following code to your main.rs file (we will walk through the code afterwards):
#[tokio::main]
pub async fn main() -> CarbonResult<()> {
// 1 - Initialize logging
unsafe {
std::env::set_var("RUST_LOG", "info");
}
env_logger::init();
log::info!("Starting Pumpfun Transaction Processor Using Carbon");
// 2 - Check environment variables
dotenv::dotenv().ok();
let geyser_url = match env::var("GEYSER_URL") {
Ok(url) if !url.is_empty() => url,
_ => {
log::error!("GEYSER_URL environment variable not set or empty");
return Err(carbon_core::error::Error::Custom(
"GEYSER_URL not set".into(),
));
}
};
log::info!("Using GEYSER_URL: {}", geyser_url);
let x_token = env::var("X_TOKEN").ok();
log::info!(
"X_TOKEN is {}",
if x_token.is_some() { "set" } else { "not set" }
);
// 3 - Initialize account filters
let account_filters: HashMap<String, SubscribeRequestFilterAccounts> = HashMap::new();
// 4 - Initialize transaction filter
let transaction_filter = SubscribeRequestFilterTransactions {
vote: Some(false),
failed: Some(false),
account_include: vec![
PUMP_FUN_MINT_AUTHORITY.to_string(),
PUMP_FUN_MIGRATION_AUTHORITY.to_string(),
],
account_exclude: vec![],
account_required: vec![PUMP_FUN_PROGRAM_ID.to_string()],
signature: None,
};
let mut transaction_filters: HashMap<String, SubscribeRequestFilterTransactions> =
HashMap::new();
transaction_filters.insert(
"raydium_launchpad_transaction_filter".to_string(),
transaction_filter,
);
// 5 - Initialize Yellowstone Grpc Geyser Client
let yellowstone_grpc = YellowstoneGrpcGeyserClient::new(
env::var("GEYSER_URL").unwrap_or_default(),
env::var("X_TOKEN").ok(),
Some(CommitmentLevel::Processed),
account_filters,
transaction_filters,
Arc::new(RwLock::new(HashSet::new())),
);
// 6 - Build and run the Carbon pipeline
carbon_core::pipeline::Pipeline::builder()
.datasource(yellowstone_grpc)
.instruction(PumpfunDecoder, PumpfunInstructionProcessor)
//.account(PumpfunDecoder, PumpfunAccountProcessor)
.shutdown_strategy(carbon_core::pipeline::ShutdownStrategy::Immediate)
.build()?
.run()
.await?;
Ok(())
}
Let's break down what this code does:
- Initialize Logging - We set the log level to "info" and initialize the logger to see informative messages in our console output.
- Check Environment Variables - We load variables from the .env file and validate that the required GEYSER_URL is set. We also check if an X_TOKEN is available for authentication.
- Initialize Account Filters - We create an empty HashMap for account filters. In this implementation, we're not filtering by specific accounts, but the structure is prepared for future expansion (for example, if you wanted to monitor changes to a specific LP account rather than transaction events).
- Initialize Transaction Filter - We create a filter to specifically target Pump.fun transactions (see the sketch after this list for how additional filter entries can be added):
  - Exclude vote transactions and failed transactions
  - Include transactions involving either the mint authority or migration authority. Without this filter, we would receive all transactions for the Pump.fun program, including buys and sells, which would add a lot of noise unrelated to our monitoring goal and ultimately use more credits. Because the mint authority is only used for creating new tokens and the migration authority is only used for migrating tokens, our monitor receives only the transactions with the instructions we care about.
  - Require the Pump.fun program ID to be present
  - No specific accounts to exclude or signatures to match (although you could use these fields to filter out malicious creators or other unwanted accounts)
- Initialize Yellowstone gRPC Client - We create the client that will connect to the Yellowstone gRPC endpoint with our configured filters. We use the Processed commitment level to receive the most up-to-date data. We have commented out an account processor since we are not using it in this example, but you can add it back in if you want to monitor account changes as well.
- Build and Run Carbon Pipeline - Finally, we build the Carbon pipeline that:
  - Uses the Yellowstone gRPC client as a data source
  - Connects the Pump.fun decoder to our processor
  - Sets an immediate shutdown strategy
  - Builds and runs the pipeline
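To illustrate how the filter map can grow, here is a minimal sketch of a second, hypothetical filter entry that would stream every successful, non-vote Pump.fun transaction (buys and sells included). The entry name and the program-ID-only filtering are illustrative assumptions rather than part of this guide's monitor, and a broad filter like this will consume considerably more credits:

```rust
// Hypothetical second filter: match any successful, non-vote transaction
// that involves the Pump.fun program (buys and sells included)
let all_pump_fun_filter = SubscribeRequestFilterTransactions {
    vote: Some(false),
    failed: Some(false),
    account_include: vec![PUMP_FUN_PROGRAM_ID.to_string()],
    account_exclude: vec![],
    account_required: vec![],
    signature: None,
};

// Each entry in the map is an independent, named subscription filter
transaction_filters.insert(
    "all_pump_fun_transactions".to_string(),
    all_pump_fun_filter,
);
```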
Great! Now we just need to define and implement the PumpfunInstructionProcessor that will handle the decoded Pump.fun instructions.
3. Implement the Instruction Processor
Below your main function, let's implement the PumpfunInstructionProcessor that will process the decoded Pump.fun instructions:
pub struct PumpfunInstructionProcessor;
#[async_trait]
impl Processor for PumpfunInstructionProcessor {
type InputType = (
InstructionMetadata,
DecodedInstruction<PumpfunInstruction>,
NestedInstructions,
);
async fn process(
&mut self,
(metadata, instruction, _nested_instructions): Self::InputType,
_metrics: Arc<MetricsCollection>,
) -> CarbonResult<()> {
let signature = metadata.transaction_metadata.signature;
let accounts = instruction.accounts;
match instruction.data {
PumpfunInstruction::Sell(_params) => {
log::info!("❌ - EXPECT TO NEVER SEE SELL INSTRUCTION");
log::info!("TxId: {}", signature);
}
PumpfunInstruction::Create(params) => {
log::info!("💊 NEW PUMPFUN TOKEN CREATED");
log::info!(
"━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
);
log::info!("TxId: {}", signature);
log::info!("Token Details:");
log::info!(" • Name: {}", params.name);
log::info!(" • Symbol: {}", params.symbol);
match Create::arrange_accounts(&accounts) {
Some(arranged_accounts) => {
log::info!(" • Mint: {}", arranged_accounts.mint);
log::info!(
"━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
);
}
None => {
log::warn!(
"Failed to arrange accounts for Create instruction. Create instruction: signature: {signature}"
);
}
}
}
PumpfunInstruction::Migrate(_params) => {
log::info!("🔄 MIGRATE INSTRUCTION DETECTED");
log::info!(
"━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
);
log::info!("TxId: {}", signature);
match Migrate::arrange_accounts(&accounts) {
Some(arranged_accounts) => {
log::info!(" • Mint: {}", arranged_accounts.mint);
log::info!(
"━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
);
}
None => {
log::warn!(
"Failed to arrange accounts for Migrate instruction. Migrate instruction: signature: {signature}"
);
}
}
}
_ => {
// Ignore non-target instructions
}
}
Ok(())
}
}
Let's break down this instruction processor:
- Struct and Trait Implementation - We define a PumpfunInstructionProcessor struct that implements the Carbon Processor trait, which is responsible for handling decoded instructions. From the Carbon documentation: "The Processor trait provides a single asynchronous method, process, which is responsible for handling data of a specific type (InputType). This trait is designed to be implemented by types that need to process data within the pipeline, allowing for customized handling of different data types."
#[async_trait]
pub trait Processor {
type InputType;
async fn process(
&mut self,
data: Self::InputType,
metrics: Arc<MetricsCollection>,
) -> CarbonResult<()>;
}
- Input Type Definition - The InputType is a tuple containing three elements:
  - InstructionMetadata: Contains transaction context information
  - DecodedInstruction<PumpfunInstruction>: The decoded Pump.fun instruction
  - NestedInstructions: Any nested inner instructions, like CPI calls
- Process Method - This is where the actual processing logic happens:
  - We extract the transaction signature and account list
  - We match against different instruction types to handle each one appropriately
- Instruction Handlers:
  - Sell Instruction - This is included for demonstration purposes only, to help illustrate how our account_include filter is working. Our transaction filters specify that the target transactions must include the mint or migration authority accounts, neither of which are present in sell instructions. Therefore, this code path should never execute, and the log message "EXPECT TO NEVER SEE SELL INSTRUCTION" reinforces this. You could remove this code entirely, or modify the filters to include sell instructions if you wanted to monitor them as well.
  - Create Instruction - When a new Pump.fun token is created:
    - We log the transaction and format it for readability
    - We extract token details from the params, like name and symbol
    - We use arrange_accounts to get structured access to the accounts involved, so that we can log the mint address
  - Migrate Instruction - When a token migration occurs, we log similar information as with Create
  - Other Instructions - Any other Pump.fun instructions are ignored
- Account Arrangement - For both Create and Migrate instructions, we use the arrange_accounts helper method. This Carbon function:
  - Takes the raw list of accounts
  - Returns them in a structured object that makes it easy to access specific accounts by name
  - Returns None if the accounts don't match the expected pattern
This processor design demonstrates a key strength of Carbon: the ability to work with decoded instruction data in a strongly typed way, rather than dealing with raw bytes. The PumpfunDecoder (provided by the carbon-pumpfun-decoder crate) handles the complex task of deserializing raw instruction data, allowing our processor to focus on business logic. You can use your IDE's IntelliSense features to explore the PumpfunInstruction enum and see all the possible instructions that can be decoded.
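For instance, if you broadened the account_include filter to stream all Pump.fun transactions, you could handle additional variants in the same match statement. Here is a hedged sketch of an extra arm that would slot into the match inside process() (it assumes the enum exposes a Buy variant; check the decoder's source for the exact variant and field names):

```rust
// Hypothetical additional match arm. With the filters used in this guide,
// buys are excluded upstream, so this arm would only fire if you
// broadened account_include in the transaction filter.
PumpfunInstruction::Buy(_params) => {
    log::info!("🟢 BUY INSTRUCTION DETECTED");
    log::info!("TxId: {}", signature);
}
```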
Running the Application
To run your Pump.fun transaction monitor:
- Make sure your .env file is set up with valid Yellowstone gRPC credentials
- Build the application:
cargo build
- Run the application:
cargo run
You should see output like:
[INFO] Starting Pumpfun Transaction Processor Using Carbon
[INFO] Using GEYSER_URL: your-geyser-url
[INFO] X_TOKEN is set
When a new Pump.fun token is created, you'll see output like:
[INFO] 💊 NEW PUMPFUN TOKEN CREATED
[INFO] ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
[INFO] TxId: 5QrTR9...truncated...
[INFO] Token Details:
[INFO] • Name: Awesome Token
[INFO] • Symbol: AWSM
[INFO] • Mint: AWsM4rKh...truncated...
[INFO] ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Great job!
Expanding Your Monitor
Here are some ways you might expand this basic monitor:
- Database Integration: Store transaction data in a database for analytics
- Expand Filters: Add more filters to target additional instruction types
- Multiple Program Monitoring: Define and insert additional transaction filters to monitor multiple programs and utilize multiple decoders (see the sketch below)
- Automate Trading: Leverage the Metis Swap API with Pump.fun support to build and implement automated trading strategies
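For the multiple-program idea, Carbon's pipeline builder accepts more than one instruction pipe. The sketch below assumes you add a second decoder crate (for example, carbon-raydium-amm-v4-decoder) to your Cargo.toml and implement a matching processor; the RaydiumAmmV4Decoder and RaydiumAmmV4InstructionProcessor names follow the crate's naming pattern but are assumptions, not verified imports:

```rust
// Hypothetical: two decoders on one pipeline, each paired with its own processor
carbon_core::pipeline::Pipeline::builder()
    .datasource(yellowstone_grpc)
    .instruction(PumpfunDecoder, PumpfunInstructionProcessor)
    .instruction(RaydiumAmmV4Decoder, RaydiumAmmV4InstructionProcessor)
    .shutdown_strategy(carbon_core::pipeline::ShutdownStrategy::Immediate)
    .build()?
    .run()
    .await?;
```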
Wrap Up
In this guide, we've built a Pump.fun transaction monitor using Carbon and Yellowstone gRPC. This approach provides high-performance, real-time monitoring with minimal code. The Carbon framework abstracts away much of the complexity of parsing Solana transaction and account data, allowing you to focus on your application logic.
If you have any questions or need help with your Solana development projects, join us in the QuickNode Discord or follow us on Twitter.
We ❤️ Feedback!
Let us know if you have any feedback or requests for new topics. We'd love to hear from you.