feat: compressible proc macro draft #1857
base: main
Conversation
Walkthrough
This update introduces a comprehensive overhaul of the compressible account SDK and test infrastructure. It adds new SDK modules for compressible account management, configuration, compression info, and idempotent decompression, implements procedural macros to automate instruction generation, and provides new Anchor and native Solana test programs. The changes expand workspace membership, update test workflows, and add extensive integration and unit tests for compressible account flows and configuration management.
Sequence Diagram(s)
sequenceDiagram
participant Client
participant Program
participant LightSystem
participant ConfigPDA
Client->>Program: create_record (with compression params)
Program->>ConfigPDA: Load compression config
Program->>LightSystem: CPI compress_account_on_init (proof, tree info)
LightSystem-->>Program: Success
Program-->>Client: Transaction success
Client->>Program: decompress_accounts_idempotent (multiple PDAs)
Program->>ConfigPDA: Load compression config
Program->>LightSystem: CPI decompress_accounts_idempotent (proof, compressed accounts)
LightSystem-->>Program: Decompression result
Program-->>Client: Transaction result
Estimated code review effort: 🎯 5 (Critical) | ⏱️ ~90+ minutes
Possibly related PRs
if rent_recipient.key != &crate::create_dynamic_pda::RENT_RECIPIENT {
    return Err(LightSdkError::ConstraintViolation);
}
no need to check the recipient if it is part of the account.
We can trust data from Solana and compressed accounts.
for params in &new_address_params {
    let address_tree_account = cpi_accounts
        .get_tree_account_info(params.address_merkle_tree_account_index as usize)?;
    if address_tree_account.pubkey() != *expected_address_space {
        msg!(
            "Invalid address space. Expected: {}. Found: {}.",
            expected_address_space,
            address_tree_account.pubkey()
        );
        return Err(LightSdkError::ConstraintViolation);
    }
}
instead of checking it we could also hardcode it as a constant
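A minimal sketch of that option (the constant value is a placeholder; real code would paste the deployed address tree pubkey and keep returning LightSdkError::ConstraintViolation):

```rust
// Sketch of hardcoding the allowed address tree instead of passing it in.
// The constant value below is a placeholder, not a real deployment address.
use solana_program::{program_error::ProgramError, pubkey::Pubkey};

pub const ALLOWED_ADDRESS_TREE: Pubkey = Pubkey::new_from_array([0u8; 32]);

pub fn check_address_tree(address_tree: &Pubkey) -> Result<(), ProgramError> {
    if *address_tree != ALLOWED_ADDRESS_TREE {
        // The SDK code above would return LightSdkError::ConstraintViolation here.
        return Err(ProgramError::InvalidAccountData);
    }
    Ok(())
}
```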
if pda_accounts.len() != addresses.len()
    || pda_accounts.len() != new_address_params.len()
    || pda_accounts.len() != output_state_tree_indices.len()
{
    return Err(LightSdkError::ConstraintViolation);
}
It's probably more dev friendly to bundle these into a struct so that it's not possible to create len mismatches.
struct CompressingPda<'info> {
    account_info: AccountInfo<'info>,
    address_params: PackedNewAddressParams,
    out_tree_index: u8,
}
proof: ValidityProof,
cpi_accounts: CpiAccounts<'_, 'info>,
owner_program: &Pubkey,
rent_recipient: &AccountInfo<'info>,
it's probably ok if it's just a Pubkey. With the Solana and Anchor entrypoints, it saves CU to pass fewer AccountInfos and use instruction data instead. We can zero-copy instruction data but not AccountInfos.
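A minimal sketch of the instruction-data variant (struct and field names are hypothetical; it assumes solana-program's default borsh support for Pubkey and matching borsh versions):

```rust
// Sketch (not the PR's API): carry the rent recipient as a Pubkey inside the
// borsh-encoded instruction data instead of passing an extra AccountInfo.
use borsh::{BorshDeserialize, BorshSerialize};
use solana_program::pubkey::Pubkey;

#[derive(BorshSerialize, BorshDeserialize)]
pub struct CompressPdaIxData {
    // Read from instruction data; no AccountInfo needed until lamports move.
    pub rent_recipient: Pubkey,
    // proof, tree indices, etc. would follow here
}
```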
@@ -11,7 +11,7 @@ proc-macro2 = { workspace = true }
quote = { workspace = true }
syn = { workspace = true }
solana-pubkey = { workspace = true, features = ["curve25519", "sha2"] }

heck = "0.4.1"
what da heck
pub last_written_slot: u64,
pub compression_delay: u64,
we should provide a header struct.
struct Header {
    pub discriminator: [u8; 8], // debatable
    pub rent_recipient: [u8; 32],
    pub last_written_slot: u64,
    pub compression_delay: u64, // can make this a u32 because we can just add it on `last_written_slot`
    pub updatable: bool,
    pub compress_sol: bool,
    pub version: u16, // enum, represented as u16
    pub _padding: [u8; 4], // for memory alignment so that we can do a zero copy
}
the header should not be part of the compressed account, and should not derive LightHasher
Force-pushed from a62316d to 054bd7c.
    pda_account.owner
);
return Err(LightSdkError::ConstraintViolation);
}
we can use the light-account-check crate instead of manual checks
let dest_starting_lamports = rent_recipient.lamports();
**rent_recipient.try_borrow_mut_lamports()? = dest_starting_lamports
    .checked_add(pda_lamports)
    .ok_or(ProgramError::ArithmeticOverflow)?;
// 2. Decrement source account lamports
**pda_account.try_borrow_mut_lamports()? = 0;
I will need this logic as well in the ctoken program; we should create a shared function which closes an account this way.
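A rough sketch of such a shared helper (the function name and signature are assumptions; the body mirrors the lamport moves in the snippet above):

```rust
// Hypothetical shared helper: drain a PDA's lamports into rent_recipient,
// following the same two steps as the snippet above.
use solana_program::{account_info::AccountInfo, program_error::ProgramError};

pub fn close_account_by_draining_lamports<'info>(
    pda_account: &AccountInfo<'info>,
    rent_recipient: &AccountInfo<'info>,
) -> Result<(), ProgramError> {
    let pda_lamports = pda_account.lamports();
    let dest_starting_lamports = rent_recipient.lamports();
    // 1. Increment destination account lamports, guarding against overflow.
    **rent_recipient.try_borrow_mut_lamports()? = dest_starting_lamports
        .checked_add(pda_lamports)
        .ok_or(ProgramError::ArithmeticOverflow)?;
    // 2. Decrement source account lamports to zero.
    **pda_account.try_borrow_mut_lamports()? = 0;
    Ok(())
}
```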
Force-pushed from 080a75d to d4dd167.
sdk-libs/client/src/compressible.rs (Outdated)
#[cfg(feature = "anchor")]
use anchor_lang::{AnchorDeserialize, AnchorSerialize};
#[cfg(not(feature = "anchor"))]
use borsh::{BorshDeserialize as AnchorDeserialize, BorshSerialize as AnchorSerialize};
this should not be in light-client; it should either be in light-sdk, since this only creates instructions, or be its own crate.
@@ -75,7 +75,7 @@ impl LightProgramTest {
    if context.config.skip_startup_logs {
        context.config.no_logs = true;
    }
-   initialize_accounts(&mut context, &config, &keypairs).await?;
+   context.test_accounts = initialize_accounts(&mut context, &config, &keypairs).await?;
the accounts are already set before with get_program_test_test_accounts
we should make sure that those contain the new accounts as well.
}
}

pub fn for_regenerate_accounts() -> TestKeypairs {
    // Note: this requires your machine to have the light-keypairs dir with the correct keypairs.
we should change that, the same way I am changing the address for the address Merkle tree in light-program-test setup.
Ideas to split this up:
- adding the second batched Merkle tree could easily be a separate pr
    rent_recipient.key,
    config.rent_recipient
);
return Err(LightSdkError::ConstraintViolation);
more specific errors make debugging easier.
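A sketch of what more specific errors could look like in the native test program (the variant names are illustrative, not from this PR):

```rust
// Illustrative only: replace the catch-all ConstraintViolation with errors
// that name the failed check, so logs point at the exact mismatch.
use solana_program::program_error::ProgramError;

#[derive(Debug, Clone, Copy)]
pub enum CompressibleError {
    InvalidRentRecipient = 0,
    InvalidAddressSpace = 1,
    CompressionDelayNotElapsed = 2,
}

impl From<CompressibleError> for ProgramError {
    fn from(e: CompressibleError) -> Self {
        // Surface the variant as a custom error code in transaction logs.
        ProgramError::Custom(e as u32)
    }
}
```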
let authority = &accounts[1];

// Verify the config PDA
let (expected_pda, _) = CompressibleConfig::derive_pda(&crate::ID);
we only allow one config per program, what if we want different configs for different account types?
Maybe we should store a vector of configs.
sdk-libs/sdk/Cargo.toml (Outdated)
anchor-lang = { workspace = true }
- anchor-lang = { workspace = true }
+ anchor-lang = { workspace = true, optional = true }
pub fn prepare_accounts_for_decompress_idempotent<'info, T>(
    pda_accounts: &[&AccountInfo<'info>],
    compressed_accounts: Vec<LightAccount<'_, T>>,
    signer_seeds: &[&[&[u8]]],
we need to use CPI_SIGNER_SEEDS right?
(If so hardcoding will save a lot of pain.)
those are just the signer_seeds for the create_account CPI to the Solana system program; our Light CPI signer seeds are passed via invoke_light_system_program
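For context, a minimal sketch of the create_account CPI those seeds sign for (standard system-program call; the seed layout shown in the comment is a placeholder):

```rust
// The signer_seeds in prepare_accounts_for_decompress_idempotent sign the
// create_account CPI to the Solana system program, as described above.
use solana_program::{
    account_info::AccountInfo, entrypoint::ProgramResult, program::invoke_signed,
    pubkey::Pubkey, rent::Rent, system_instruction, sysvar::Sysvar,
};

pub fn create_pda_account<'info>(
    payer: &AccountInfo<'info>,
    pda: &AccountInfo<'info>,
    system_program: &AccountInfo<'info>,
    owner_program: &Pubkey,
    space: u64,
    signer_seeds: &[&[u8]], // e.g. [b"user_record", user_key.as_ref(), &[bump]]
) -> ProgramResult {
    // Fund the new account with the rent-exempt minimum for its size.
    let lamports = Rent::get()?.minimum_balance(space as usize);
    invoke_signed(
        &system_instruction::create_account(payer.key, pda.key, lamports, space, owner_program),
        &[payer.clone(), pda.clone(), system_program.clone()],
        &[signer_seeds],
    )
}
```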
Main questions:
here #1857 (comment): we can rename this (edit: renamed), or we could add a u8 config_index to the derivation path and have 1 account per config - what's better?
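A minimal sketch of the config_index option (only the b"compressible_config" seed is taken from the PR docs; adding the index byte is the hypothetical part under discussion):

```rust
// Sketch of the second option: derive one config PDA per u8 config_index.
use solana_program::pubkey::Pubkey;

pub fn derive_config_pda(program_id: &Pubkey, config_index: u8) -> (Pubkey, u8) {
    // Appending the index byte to the existing seed yields one PDA per config.
    let seeds: &[&[u8]] = &[b"compressible_config", &[config_index]];
    Pubkey::find_program_address(seeds, program_id)
}
```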
.github/workflows/sdk-tests.yml (Outdated)
name: sdk-tests
if: github.event.pull_request.draft == false
runs-on: warp-ubuntu-latest-x64-4x
timeout-minutes: 60

services:
  redis:
    image: redis:8.0.1
    ports:
      - 6379:6379
    options: >-
      --health-cmd "redis-cli ping"
      --health-interval 10s
      --health-timeout 5s
      --health-retries 5

env:
  REDIS_URL: redis://localhost:6379
  RUST_MIN_STACK: 8388608

steps:
  - name: Checkout sources
    uses: actions/checkout@v4

  - name: Setup and build
    uses: ./.github/actions/setup-and-build
    with:
      skip-components: "redis"

  - name: Build CLI
    run: |
      source ./scripts/devenv.sh
      npx nx build @lightprotocol/zk-compression-cli

  - name: Build core programs
    run: |
      source ./scripts/devenv.sh
      npx nx build @lightprotocol/programs

  - name: Build and test all sdk-tests programs
    run: |
      source ./scripts/devenv.sh
      # Increase stack size for SBF compilation to avoid regex_automata stack overflow
      export RUST_MIN_STACK=16777216
      # Remove -D warnings flag for SBF compilation to avoid compilation issues
      export RUSTFLAGS=""

      echo "Building and testing all sdk-tests programs sequentially..."
      # Build and test each program one by one to ensure .so files exist

      echo "Building and testing native-compressible (sdk-test)..."
      cargo-test-sbf -p sdk-test

      echo "Building and testing anchor-compressible..."
      cargo-test-sbf -p anchor-compressible

      echo "Building and testing anchor-compressible-derived..."
      cargo-test-sbf -p anchor-compressible-derived
Check warning (Code scanning / CodeQL): Workflow does not contain permissions (Medium)
Copilot Autofix (AI, 22 days ago)
To fix the problem, you should add a permissions block to the workflow or to the specific job(s) that require it. The minimal starting point is to set contents: read, which restricts the GITHUB_TOKEN to read-only access to repository contents. If the job requires additional permissions (e.g., to create or update pull requests), you can add those as needed. In this case, since the workflow appears to be running tests and builds, contents: read is likely sufficient. The change should be made at the job level (under jobs.sdk-tests) or at the workflow root (top-level, before jobs:). Since the CodeQL warning is on the job, adding it to the job is the most targeted fix.
You only need to add the following block under jobs.sdk-tests: (after line 32, before runs-on):
permissions:
  contents: read
No additional imports or definitions are required.
@@ -32,2 +32,4 @@
    name: sdk-tests
+   permissions:
+     contents: read
    if: github.event.pull_request.draft == false
Actionable comments posted: 26
🔭 Outside diff range comments (1)
sdk-tests/anchor-compressible-derived/tests/test_decompress_multiple.rs.bak (1)
1-1180: Remove backup file from version control. Backup files (.bak) should not be committed to the repository. Version control already provides file history, making backup files redundant and potentially confusing.
♻️ Duplicate comments (4)
sdk-libs/sdk/Cargo.toml (1)
36-36: Revert anchor-lang to optional dependency to align with past review feedback. This change contradicts the previous review comment suggesting anchor-lang = { workspace = true, optional = true }. Making anchor-lang mandatory forces all SDK users to include Anchor dependencies even when not using Anchor features. Apply this diff to revert to optional dependency:
-anchor-lang = { workspace = true }
+anchor-lang = { workspace = true, optional = true }
And ensure it's included in the "anchor" feature array on line 16-19.
.github/workflows/sdk-tests.yml (1)
30-89: Add explicit permissions block for security. The workflow should limit GITHUB_TOKEN permissions as recommended by the static analysis tool. This follows security best practices.
jobs:
  sdk-tests:
    name: sdk-tests
+   permissions:
+     contents: read
    if: github.event.pull_request.draft == false
    runs-on: warp-ubuntu-latest-x64-4x
sdk-libs/sdk/src/compressible/decompress_idempotent.rs (1)
42-42: Use a slice instead of Vec for compressed_accounts parameter. As previously noted, we should avoid vectors in function signatures. Consider using &[LightAccount<'_, T>] instead of Vec<LightAccount<'_, T>> to avoid unnecessary ownership transfer and improve performance.
sdk-libs/sdk/src/compressible/config.rs (1)
20-35: Move static fields before the vector field. As mentioned in the previous review, static values should come before vectors in the struct definition for better memory layout.
pub struct CompressibleConfig {
    /// Config version for future upgrades
    pub version: u8,
    /// Number of slots to wait before compression is allowed
    pub compression_delay: u32,
    /// Authority that can update the config
    pub update_authority: Pubkey,
    /// Account that receives rent from compressed PDAs
    pub rent_recipient: Pubkey,
    /// Config bump seed (for multiple configs per program)
    pub config_bump: u8,
    /// PDA bump seed
    pub bump: u8,
    /// Address space for compressed accounts (exactly 1 address_tree allowed)
    pub address_space: Vec<Pubkey>,
}
🧹 Nitpick comments (26)
sdk-tests/anchor-compressible-derived/Xargo.toml (1)
1-2
: Consider addingpanic_immediate_abort
for smaller BPF binariesBuilding Solana BPF programs with
std
enabled often inflates binary size.
The common pattern is to compilestd
in abort mode:[target.bpfel-unknown-unknown.dependencies.std] features = ["panic_immediate_abort"]Adding the feature strips unwinding and usually shaves several KB off the ELF, helping stay below the 200 KiB limit.
If size is already comfortably under the cap, feel free to ignore.sdk-libs/macros/src/cpi_signer.rs (1)
5-6
: LGTM! Reasonable temporary suppression during refactoring.The
#[allow(dead_code)]
attribute and TODO comment appropriately handle what appears to be legacy code during the macro refactoring process.Consider creating an issue to track the TODO and determine if this function can be removed or if it's still needed somewhere in the codebase.
cli/accounts/cpi_context_batched_2_HwtjxDvFEXiWnzeMeWkMBzpQN45A95rTJNZmz1Z3pe8R.json (1)
1-14
: Consider adding a trailing newline.The JSON structure is correct and serves its purpose as test account data. However, the file should end with a newline character for better consistency with standard text file conventions.
Add a newline at the end of the file:
} +
sdk-tests/anchor-compressible/test_discriminator.rs (1)
4-9
: Consider converting to an automated test with assertions.While this validation script serves its purpose, it would be more robust as an automated test that actually fails when the discriminator doesn't match the expected value.
Consider this approach for better test automation:
fn main() { let discriminator = CompressibleConfig::LIGHT_DISCRIMINATOR; - println!("Expected discriminator: {:?}", discriminator); - println!("Account data discriminator: {:?}", [55, 213, 41, 112, 43, 227, 172, 215]); - println!("Match: {}", discriminator == [55, 213, 41, 112, 43, 227, 172, 215]); + let expected = [55, 213, 41, 112, 43, 227, 172, 215]; + + assert_eq!( + discriminator, + expected, + "Discriminator mismatch: got {:?}, expected {:?}", + discriminator, + expected + ); + + println!("✅ Discriminator validation passed!"); }Alternatively, convert this to a proper unit test using
#[test]
annotation.program-tests/utils/src/test_keypairs.rs (1)
75-75
: Fix typo in comment.- // Note: this requries your machine to have the light-keypairs dir with the correct keypairs. + // Note: this requires your machine to have the light-keypairs dir with the correct keypairs.sdk-libs/program-test/src/accounts/test_accounts.rs (1)
92-92
: Address the TODO comment for CPI context replacement.The TODO comment indicates that the hardcoded CPI context pubkey needs to be replaced. This should be addressed before merging.
Would you like me to generate a proper pubkey for this CPI context or help create an issue to track this task?
sdk-libs/sdk/src/compressible/decompress_idempotent.rs (1)
74-74
: Remove debug print statementDebug
msg!
statements should be removed from production code or replaced with appropriate logging mechanisms.- msg!("solana_account: {:?}", solana_account);
sdk-tests/native-compressible/Cargo.toml (1)
2-2
: Consider using a more specific package nameThe package name "sdk-test" is too generic and doesn't indicate that this is specifically for native compressible tests. Consider renaming to something more descriptive like "native-compressible-test" or "sdk-native-compressible-test" to avoid confusion with other test packages.
-name = "sdk-test" +name = "native-compressible-test"sdk-tests/anchor-compressible/tests/test_config.rs (1)
484-484
: Consider using structured approach for field offsetsThe test uses hardcoded byte offsets to corrupt specific fields, which is fragile and will break if the account structure changes. While the comment shows the calculation, consider using a more maintainable approach.
Consider defining constants or using a helper function to calculate offsets:
// Define offsets as constants based on the structure const DISCRIMINATOR_SIZE: usize = 8; const VERSION_OFFSET: usize = DISCRIMINATOR_SIZE; const VERSION_SIZE: usize = 1; const COMPRESSION_DELAY_OFFSET: usize = VERSION_OFFSET + VERSION_SIZE; const COMPRESSION_DELAY_SIZE: usize = 4; // ... and so on // Or create a helper to calculate offsets fn get_field_offset(field: ConfigField) -> usize { match field { ConfigField::Version => DISCRIMINATOR_SIZE, ConfigField::CompressionDelay => DISCRIMINATOR_SIZE + 1, // ... } }Also applies to: 540-544
sdk-libs/program-test/src/program_test/compressible_setup.rs (2)
91-95
: Improve input validation with descriptive error message.The empty address space validation is good, but the error message could be more specific about the requirement.
- return Err(RpcError::CustomError( - "At least one address space must be provided".to_string(), - )); + return Err(RpcError::CustomError( + "Address space cannot be empty: at least one Merkle tree must be specified for compression configuration".to_string(), + ));
109-114
: Optimize signer deduplication logic.The current signer deduplication logic works but could be more efficient and readable.
- let signers = if payer.pubkey() == authority.pubkey() { - vec![payer] - } else { - vec![payer, authority] - }; + let mut signers = vec![payer]; + if payer.pubkey() != authority.pubkey() { + signers.push(authority); + }This approach scales better if more conditional signers are added in the future.
sdk-tests/native-compressible/src/create_config.rs (1)
38-38
: Document the hardcoded bump value.The hardcoded bump value of 0 has a comment but could benefit from more detailed documentation about the design decision.
- 0, // one global config for now, so bump is 0. + 0, // Global config uses bump 0 - only one config per program supportedsdk-libs/sdk/src/compressible/compression_info.rs (1)
88-91
: Clarify the Space implementation comment.The comment could be more precise about what's being counted.
#[cfg(feature = "anchor")] impl anchor_lang::Space for CompressionInfo { - const INIT_SPACE: usize = 8 + 1; // u64 + enum (1 byte for discriminant) + const INIT_SPACE: usize = 8 + 1; // u64 (last_written_slot) + enum discriminant (1 byte) }sdk-libs/macros/src/compressible.rs (1)
380-386
: Update comment to reflect actual customization needs.The comment about seeds being customizable doesn't match the hardcoded example.
#[account( mut, - seeds = [b"user_record", user.key().as_ref()], // This should be customizable + seeds = [b"user_record", user.key().as_ref()], // TODO: Seeds should be customized per implementation bump, // Add your custom constraints here )]sdk-libs/sdk/src/compressible/compress_account.rs (1)
38-92
: Consider extracting common compression logic to reduce duplication.Both
compress_account
andcompress_pda_native
share nearly identical logic for compression delay validation, compression info handling, and CPI invocation. This duplication increases maintenance burden and risk of inconsistencies.Consider extracting the common logic into a shared helper function:
fn validate_and_prepare_compression<A>( compression_info: &CompressionInfo, compression_delay: &u32, ) -> Result<(), crate::ProgramError> where A: HasCompressionInfo, { let current_slot = Clock::get()?.slot; let last_written_slot = compression_info.last_written_slot(); if current_slot < last_written_slot + *compression_delay as u64 { msg!( "Cannot compress yet. {} slots remaining", (last_written_slot + *compression_delay as u64).saturating_sub(current_slot) ); return Err(LightSdkError::ConstraintViolation.into()); } Ok(()) }Also applies to: 116-165
sdk-libs/macros/src/EXAMPLE_USAGE.md (3)
186-193
: Add language specifier to fenced code block for proper syntax highlighting.-``` +```text 0. [writable, signer] Payer account 1. [writable] Config PDA (seeds: [b"compressible_config"]) 2. [] Program data account 3. [signer] Program upgrade authority 4. [] System program -``` +```
196-203
: Add language specifier to fenced code block for proper syntax highlighting.-``` +```text 0. [signer] Authority 1. [writable] PDA account to compress 2. [] System program 3. [] Config PDA 4. [] Rent recipient (must match config) 5... [] Light Protocol system accounts -``` +```
207-213
: Add language specifier to fenced code block for proper syntax highlighting.-``` +```text 0. [writable, signer] Fee payer 1. [writable, signer] Rent payer 2. [] System program 3..N. [writable] PDA accounts to decompress into N+1... [] Light Protocol system accounts -``` +```sdk-tests/native-compressible/tests/test_compressible_flow.rs (1)
92-93
: Uncomment or remove the data length assertion.The commented assertion for data length should either be uncommented if it's a valid check, or removed entirely with an explanation.
assert!(account.unwrap().lamports > 0); - // assert_eq!(account.unwrap().data.len(), 31); + // Verify account data size includes discriminator + data + assert_eq!(account.unwrap().data.len(), 8 + 31 + std::mem::size_of::<CompressionInfo>());sdk-libs/sdk/src/compressible/config.rs (1)
426-432
: Use standard Pubkey formatting instead of .log() method.The
.log()
method is not a standard Pubkey method. Use standard formatting for consistency.msg!( - "create_compression_config_checked program_data_account: {:?}", - program_data_account.key.log() + "create_compression_config_checked program_data_account: {}", + program_data_account.key ); msg!( - "create_compression_config_checked program_id: {:?}", - program_id.log() + "create_compression_config_checked program_id: {}", + program_id );sdk-tests/anchor-compressible/tests/test_decompress_multiple.rs (2)
30-143
: Add descriptive comments for test scenariosThe test covers multiple scenarios but would benefit from comments explaining each phase:
- Individual account creation and compression
- Slot warping for decompression delay
- Combined account creation in single transaction
Consider adding section comments to improve readability.
282-284
: Remove redundant CU simulationThe compute unit simulation is performed twice - once before and once in the transaction creation. The first simulation result is not used.
Remove the redundant simulation:
- let cu = simulate_cu(rpc, payer, &instruction).await; - println!("CreateRecord CU consumed: {}", cu); - // Create and send transactionsdk-tests/anchor-compressible-derived/src/lib.rs (3)
36-37
: Improve error handling specificityThe generic
AccountDidNotDeserialize
error doesn't accurately represent a config loading failure.Consider using a more specific error or creating a custom error:
- let config = CompressibleConfig::load_checked(&ctx.accounts.config, &crate::ID) - .map_err(|_| anchor_lang::error::ErrorCode::AccountDidNotDeserialize)?; + let config = CompressibleConfig::load_checked(&ctx.accounts.config, &crate::ID)?;
91-184
: LGTM! Efficient batched account creationExcellent implementation that efficiently creates and compresses multiple accounts in a single CPI call. The use of
prepare_accounts_for_compression_on_init
for each account type followed by a combined CPI invocation is the correct pattern.Consider adding a comment explaining the batching benefit for future maintainers.
231-252
: Consider using InitSpace for automatic space calculationThe manual space calculations are error-prone and need to be updated when fields change. Since the account structs already derive
InitSpace
, consider using it for automatic calculation.Example approach:
space = UserRecord::INIT_SPACEThis would eliminate the need for manual calculations and reduce maintenance burden.
sdk-libs/light-compressible-client/src/lib.rs (1)
219-234
: Extract complex tree info matching logicThe nested find operation with multiple conditions is complex and appears twice in the codebase (here and in decompress_accounts_idempotent). Consider extracting this into a helper method.
fn find_packed_tree_info( packed_tree_infos: &PackedTreeInfos, queue_index: u8, leaf_index: u32, ) -> Option<PackedStateTreeInfo> { packed_tree_infos .state_trees .as_ref()? .packed_tree_infos .iter() .find(|pti| { pti.queue_pubkey_index == queue_index && pti.leaf_index == leaf_index }) .copied() }
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (2)
Cargo.lock is excluded by !**/*.lock
pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (73)
- .github/workflows/light-examples-tests.yml (2 hunks)
- .github/workflows/sdk-tests.yml (1 hunks)
- Cargo.toml (4 hunks)
- cli/accounts/batch_state_merkle_tree_2_2Yb3fGo2E9aWLjY8KuESaqurYpGGhEeJr7eynKrSgXwS.json (1 hunks)
- cli/accounts/batched_output_queue_2_12wJT3xYd46rtjeqDU6CrtT8unqLjPiheggzqhN9YsyB.json (1 hunks)
- cli/accounts/cpi_context_batched_2_HwtjxDvFEXiWnzeMeWkMBzpQN45A95rTJNZmz1Z3pe8R.json (1 hunks)
- pnpm-workspace.yaml (1 hunks)
- program-tests/sdk-test/src/lib.rs (0 hunks)
- program-tests/sdk-test/tests/test.rs (0 hunks)
- program-tests/utils/src/test_keypairs.rs (2 hunks)
- sdk-libs/client/Cargo.toml (2 hunks)
- sdk-libs/client/src/indexer/tree_info.rs (1 hunks)
- sdk-libs/client/src/lib.rs (1 hunks)
- sdk-libs/client/src/rpc/client.rs (1 hunks)
- sdk-libs/light-compressible-client/Cargo.toml (1 hunks)
- sdk-libs/light-compressible-client/src/lib.rs (1 hunks)
- sdk-libs/macros/CHANGELOG.md (1 hunks)
- sdk-libs/macros/Cargo.toml (1 hunks)
- sdk-libs/macros/src/EXAMPLE_USAGE.md (1 hunks)
- sdk-libs/macros/src/compressible.rs (1 hunks)
- sdk-libs/macros/src/cpi_signer.rs (1 hunks)
- sdk-libs/macros/src/lib.rs (2 hunks)
- sdk-libs/macros/src/native_compressible.rs (1 hunks)
- sdk-libs/program-test/Cargo.toml (1 hunks)
- sdk-libs/program-test/src/accounts/initialize.rs (2 hunks)
- sdk-libs/program-test/src/accounts/test_accounts.rs (2 hunks)
- sdk-libs/program-test/src/accounts/test_keypairs.rs (3 hunks)
- sdk-libs/program-test/src/indexer/test_indexer.rs (2 hunks)
- sdk-libs/program-test/src/lib.rs (1 hunks)
- sdk-libs/program-test/src/program_test/compressible_setup.rs (1 hunks)
- sdk-libs/program-test/src/program_test/light_program_test.rs (2 hunks)
- sdk-libs/program-test/src/program_test/mod.rs (2 hunks)
- sdk-libs/program-test/src/utils/mod.rs (1 hunks)
- sdk-libs/program-test/src/utils/simulation.rs (1 hunks)
- sdk-libs/sdk-types/src/constants.rs (1 hunks)
- sdk-libs/sdk/Cargo.toml (1 hunks)
- sdk-libs/sdk/src/account.rs (7 hunks)
- sdk-libs/sdk/src/compressible/compress_account.rs (1 hunks)
- sdk-libs/sdk/src/compressible/compress_account_on_init.rs (1 hunks)
- sdk-libs/sdk/src/compressible/compression_info.rs (1 hunks)
- sdk-libs/sdk/src/compressible/config.rs (1 hunks)
- sdk-libs/sdk/src/compressible/decompress_idempotent.rs (1 hunks)
- sdk-libs/sdk/src/compressible/mod.rs (1 hunks)
- sdk-libs/sdk/src/error.rs (1 hunks)
- sdk-libs/sdk/src/lib.rs (2 hunks)
- sdk-tests/anchor-compressible-derived/Cargo.toml (1 hunks)
- sdk-tests/anchor-compressible-derived/README.md (1 hunks)
- sdk-tests/anchor-compressible-derived/Xargo.toml (1 hunks)
- sdk-tests/anchor-compressible-derived/src/lib.rs (1 hunks)
- sdk-tests/anchor-compressible-derived/tests/test_decompress_multiple.rs (1 hunks)
- sdk-tests/anchor-compressible-derived/tests/test_decompress_multiple.rs.bak (1 hunks)
- sdk-tests/anchor-compressible/CONFIG.md (1 hunks)
- sdk-tests/anchor-compressible/Cargo.toml (1 hunks)
- sdk-tests/anchor-compressible/Xargo.toml (1 hunks)
- sdk-tests/anchor-compressible/src/lib.rs (1 hunks)
- sdk-tests/anchor-compressible/test_discriminator.rs (1 hunks)
- sdk-tests/anchor-compressible/tests/test_config.rs (1 hunks)
- sdk-tests/anchor-compressible/tests/test_decompress_multiple.rs (1 hunks)
- sdk-tests/anchor-compressible/tests/test_instruction_builders.rs (1 hunks)
- sdk-tests/native-compressible/Cargo.toml (1 hunks)
- sdk-tests/native-compressible/src/compress_dynamic_pda.rs (1 hunks)
- sdk-tests/native-compressible/src/create_config.rs (1 hunks)
- sdk-tests/native-compressible/src/create_dynamic_pda.rs (1 hunks)
- sdk-tests/native-compressible/src/create_pda.rs (2 hunks)
- sdk-tests/native-compressible/src/decompress_dynamic_pda.rs (1 hunks)
- sdk-tests/native-compressible/src/lib.rs (1 hunks)
- sdk-tests/native-compressible/src/update_config.rs (1 hunks)
- sdk-tests/native-compressible/tests/test_compressible_flow.rs (1 hunks)
- sdk-tests/native-compressible/tests/test_config.rs (1 hunks)
- sdk-tests/package.json (1 hunks)
- xtask/Cargo.toml (1 hunks)
- xtask/src/create_batch_state_tree.rs (1 hunks)
- xtask/src/new_deployment.rs (1 hunks)
💤 Files with no reviewable changes (2)
- program-tests/sdk-test/src/lib.rs
- program-tests/sdk-test/tests/test.rs
🧰 Additional context used
🧬 Code Graph Analysis (5)
sdk-tests/native-compressible/src/create_pda.rs (1)
sdk-tests/native-compressible/src/decompress_dynamic_pda.rs (1)
size
(218-221)
sdk-libs/program-test/src/lib.rs (2)
sdk-tests/anchor-compressible/src/lib.rs (2)
initialize_compression_config
(83-103)update_compression_config
(106-124)sdk-libs/program-test/src/program_test/compressible_setup.rs (3)
initialize_compression_config
(80-117)setup_mock_program_data
(44-61)update_compression_config
(137-161)
sdk-tests/native-compressible/tests/test_config.rs (2)
sdk-libs/sdk/src/compressible/config.rs (2)
derive_pda
(61-63)derive_default_pda
(66-68)sdk-tests/anchor-compressible/tests/test_config.rs (1)
test_config_validation
(59-85)
sdk-libs/sdk/src/compressible/compression_info.rs (2)
sdk-tests/anchor-compressible/src/lib.rs (12)
compression_info
(608-613)compression_info
(668-672)compression_info
(719-723)compression_info_mut
(615-620)compression_info_mut
(674-678)compression_info_mut
(725-729)compression_info_mut_opt
(622-627)compression_info_mut_opt
(680-682)compression_info_mut_opt
(731-733)set_compression_info_none
(629-634)set_compression_info_none
(684-686)set_compression_info_none
(735-737)sdk-tests/native-compressible/src/decompress_dynamic_pda.rs (4)
compression_info
(196-200)compression_info_mut
(202-206)compression_info_mut_opt
(208-210)set_compression_info_none
(212-214)
sdk-tests/anchor-compressible-derived/src/lib.rs (7)
sdk-libs/macros/src/compressible.rs (2)
compressible
(20-25)add_compressible_instructions
(41-453)sdk-tests/anchor-compressible/src/lib.rs (6)
compress_account_on_init
(56-56)compress_account_on_init
(278-278)prepare_accounts_for_compression_on_init
(340-340)prepare_accounts_for_compression_on_init
(357-357)create_record
(27-68)update_record
(70-80)sdk-libs/sdk/src/compressible/compress_account_on_init.rs (2)
compress_account_on_init
(26-67)prepare_accounts_for_compression_on_init
(92-169)js/stateless.js/src/state/compressed-account.ts (1)
PackedAddressTreeInfo
(146-159)js/stateless.js/src/state/types.ts (2)
ValidityProof
(397-410)CompressedAccountData
(199-212)sdk-libs/sdk/src/compressible/config.rs (1)
load_checked
(97-125)sdk-libs/sdk/src/compressible/compression_info.rs (1)
new_decompressed
(34-39)
🪛 GitHub Check: CodeQL
.github/workflows/sdk-tests.yml
[warning] 32-89: Workflow does not contain permissions
Actions job or workflow does not limit the permissions of the GITHUB_TOKEN. Consider setting an explicit permissions block, using the following as a minimal starting point: {{contents: read}}
🪛 markdownlint-cli2 (0.17.2)
sdk-libs/macros/src/EXAMPLE_USAGE.md
186-186: Fenced code blocks should have a language specified
(MD040, fenced-code-language)
196-196: Fenced code blocks should have a language specified
(MD040, fenced-code-language)
207-207: Fenced code blocks should have a language specified
(MD040, fenced-code-language)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (21)
- GitHub Check: stateless-js-v2
- GitHub Check: cli-v2
- GitHub Check: Test program-libs-slow
- GitHub Check: Test concurrent-merkle-tree
- GitHub Check: Test sdk-libs
- GitHub Check: Test program-libs-fast
- GitHub Check: Test batched-merkle-tree-simulate
- GitHub Check: examples-tests (sdk-anchor-test-program, ["cargo-test-sbf -p sdk-anchor-test", "cargo-test-sbf -p...
- GitHub Check: sdk-tests
- GitHub Check: system-programs (system-cpi-test, ["cargo-test-sbf -p system-cpi-test"])
- GitHub Check: system-programs (system-cpi-test-v2-event, ["cargo-test-sbf -p system-cpi-v2-test -- event::parse"])
- GitHub Check: system-programs (random-e2e-test, ["cargo-test-sbf -p e2e-test"])
- GitHub Check: system-programs (light-system-program, ["cargo-test-sbf -p system-test"])
- GitHub Check: system-programs (account-compression, ["cargo-test-sbf -p account-compression-test"])
- GitHub Check: system-programs (system-cpi-test-v2-other, ["cargo-test-sbf -p system-cpi-v2-test -- --skip funct...
- GitHub Check: system-programs (light-registry, ["cargo-test-sbf -p registry-test"])
- GitHub Check: system-programs (system-cpi-test-v2-functional, ["cargo-test-sbf -p system-cpi-v2-test -- functio...
- GitHub Check: system-programs (light-compressed-token, ["cargo-test-sbf -p compressed-token-test"])
- GitHub Check: lint
- GitHub Check: stateless-js-v1
- GitHub Check: cli-v1
🔇 Additional comments (103)
xtask/Cargo.toml (1)
36-36: Verify workspace-level declaration for base64 before merging. base64 = { workspace = true } assumes the root workspace already defines the exact version of base64. If that entry is missing or mismatched, cargo will error with "package base64 not found in workspace".
- Confirm there is a [workspace.dependencies.base64] stanza in the root Cargo.toml.
- If the crate is only needed by xtask, consider pinning the version locally instead of elevating it to the whole workspace to avoid unnecessary compilation for all members. Example local pin:
-base64 = { workspace = true }
+base64 = "0.22" # or whatever version you intend to use
(or keep the workspace approach if other crates will share it).
sdk-libs/program-test/src/utils/mod.rs (1)
6-6: Module exposure looks good. The new pub mod simulation; cleanly plugs the helper into the existing util façade and keeps namespacing consistent with sibling modules. No further issues spotted.
sdk-libs/client/src/lib.rs (1)
84-84: Double-check that client::utils is actually present. pub mod utils; will fail to compile if src/utils/mod.rs (or utils.rs) hasn't been added in the same PR. Please ensure the module file exists and is committed; otherwise CI will break.
sdk-libs/program-test/Cargo.toml (1)
23-23: Dependency addition acknowledged. light-compressible-client is now pulled in with the anchor feature, matching the rest of the crate's Anchor-enabled stack. No immediate concerns: the build graph remains acyclic and feature flags align.
sdk-libs/macros/Cargo.toml (1)
14-14: LGTM! Standard dependency for macro string manipulation. The heck crate is commonly used in procedural macros for identifier case conversions, which aligns with the compressible macro functionality being introduced.
sdk-tests/anchor-compressible/Xargo.toml (1)
1-2: LGTM! Standard Xargo configuration for Solana BPF programs. The configuration correctly targets bpfel-unknown-unknown with minimal std features, which is appropriate for Solana program compilation.
pnpm-workspace.yaml (1)
13-13: LGTM! Necessary workspace configuration for new SDK test packages. The addition properly includes the new sdk-tests directory in the pnpm workspace, enabling management of the test packages introduced in this PR.
sdk-tests/native-compressible/src/create_pda.rs (2)
3-3: LGTM! Proper import for Size trait. The Size trait import is necessary for the trait implementation added below.
79-83: LGTM! Correct Size trait implementation. The implementation correctly returns 31 bytes, matching the size of the data: [u8; 31] field in the struct. This follows the same pattern seen in other parts of the codebase and properly supports compressed account size calculations.
sdk-libs/client/Cargo.toml (1)
38-38
: LGTM!The optional
anchor-lang
dependency withidl-build
feature appropriately supports the new compressible account functionality introduced in this PR. The optional nature ensures it's only included when needed.sdk-libs/sdk-types/src/constants.rs (1)
42-43
: LGTM!The
DEFAULT_DATA_HASH
constant is well-defined with a clear purpose for representing empty data in compression contexts. The choice of[1; 32]
(all ones) is intentional and distinguishable from common default patterns like all zeros.sdk-libs/program-test/src/program_test/mod.rs (2)
1-1
: LGTM!The new
compressible_setup
module is properly declared and follows standard Rust module organization patterns.
11-11
: LGTM!The re-export of
compressible_setup
items is appropriately placed with other module re-exports, making the helper functions easily accessible to users of the program test framework.sdk-libs/sdk/Cargo.toml (1)
30-33
: LGTM! New Solana system dependencies support expanded functionality.The addition of
solana-system-interface
,solana-clock
,solana-sysvar
,solana-rent
, andarrayvec
dependencies aligns well with the expanded compression and system integration features described in the PR summary.Also applies to: 42-42
xtask/src/create_batch_state_tree.rs (1)
81-86
: LGTM! Improved file path organization.The change from using pubkey directly as filename to prepending directory prefixes (
"./target/mt-"
,"./target/nfq-"
,"./target/cpi-"
) creates better file organization and aligns with the expanded keypair management seen throughout the PR.xtask/src/new_deployment.rs (1)
313-315
: LGTM! New keypairs support expanded v2 state tree infrastructure.The addition of
batched_state_merkle_tree_2
,batched_output_queue_2
, andbatched_cpi_context_2
keypairs aligns with the broader expansion of the test infrastructure to support multiple v2 state trees, as evidenced by similar changes throughout the codebase.sdk-libs/program-test/src/program_test/light_program_test.rs (2)
78-78
: Verify this assignment doesn't conflict with existing account setup.The change to assign
initialize_accounts
result tocontext.test_accounts
is logical, but a past review comment mentioned "the accounts are already set before withget_program_test_test_accounts
we should make sure that those contain the new accounts as well."Please verify that this assignment properly integrates with the existing account setup and doesn't cause conflicts with accounts already set by
get_program_test_test_accounts()
.
128-129
: LGTM! Cleaner prover configuration handling.The simplification of prover config handling by removing explicit referencing and cloning makes the code cleaner and more idiomatic.
sdk-libs/sdk/src/error.rs (1)
95-101
: LGTM! Proper Anchor error integration.The new
From
implementation enables seamless error interoperability between the Light SDK and Anchor programs. The feature-gated approach and conversion path throughProgramError::Custom
follows Anchor conventions correctly.sdk-libs/program-test/src/lib.rs (1)
124-127
: LGTM! Clean API expansion for compression testing.The new re-exports provide convenient access to compression configuration utilities, making the testing API more ergonomic while maintaining clean module organization.
sdk-tests/package.json (1)
1-18
: LGTM! Well-structured test package configuration.The package.json is appropriately configured for SDK test programs with clear build scripts and nx integration. The documentation about using
test-sbf
instead ofbuild-sbf -- -p
to avoid infinite loops is helpful context.sdk-libs/program-test/src/indexer/test_indexer.rs (2)
90-91
: LGTM! Added support for second batched output queue test keypair.The import correctly adds the second test keypair needed for expanded v2 state tree support.
290-295
: LGTM! Extended StateV2 recognition for multiple test keypairs.The condition properly recognizes both test keypairs for batched output queues, enabling support for multiple v2 state merkle trees in the test environment. This is a clean extension of the existing logic.
sdk-libs/client/src/rpc/client.rs (1)
693-721
: LGTM! Clean implementation of multi-tree v2 support.The conditional compilation correctly handles both v1 and v2 scenarios. The hardcoded pubkeys are appropriate for localnet testing, and the dual TreeInfo structure properly supports the new multi-tree v2 architecture.
cli/accounts/batch_state_merkle_tree_2_2Yb3fGo2E9aWLjY8KuESaqurYpGGhEeJr7eynKrSgXwS.json (1)
1-1
: Valid test fixture for v2 state merkle tree.This JSON file provides the serialized account state for the second v2 batch state merkle tree. The pubkey matches the configuration used in the client RPC layer, ensuring consistency across the test infrastructure.
cli/accounts/batched_output_queue_2_12wJT3xYd46rtjeqDU6CrtT8unqLjPiheggzqhN9YsyB.json (1)
1-1
: Valid test fixture for v2 batched output queue.This JSON file provides the serialized account state for the second v2 batched output queue. The pubkey is consistent with the mapping configurations, supporting the expanded multi-tree v2 infrastructure.
sdk-libs/client/src/indexer/tree_info.rs (1)
295-317
: Properly extends tree mapping for second v2 tree.The new entries correctly follow the existing pattern with bidirectional mapping for both the queue and tree pubkeys. The TreeType::StateV2 designation and pubkey consistency across the codebase ensures proper integration with the expanded v2 infrastructure.
program-tests/utils/src/test_keypairs.rs (2)
67-70
: LGTM! Proper extension of test keypairs for v2 infrastructure.The new keypair fields correctly support the second v2 state tree and follow the established initialization pattern using byte arrays from constants.
152-155
: Consistent keypair initialization for regeneration function.The second function properly mirrors the keypair initialization pattern, ensuring consistency between the two initialization methods.
sdk-libs/program-test/src/accounts/test_accounts.rs (2)
83-94
: LGTM! Second v2 state tree properly added for local test validator.The addition of a second
StateMerkleTreeAccountsV2
entry follows the existing pattern and maintains consistency with the hardcoded pubkey approach used elsewhere in the local test validator setup.
137-160
: LGTM! Second v2 state tree properly integrated with keypair-based approach.The program test implementation correctly uses the new keypairs (
BATCHED_STATE_MERKLE_TREE_TEST_KEYPAIR_2
, etc.) to derive pubkeys for the second v2 state tree, maintaining consistency with the existing keypair-based pattern.sdk-libs/sdk/src/lib.rs (3)
106-106
: LGTM! LightAccount re-export added consistently.The re-export follows the existing pattern and makes the
LightAccount
type available at the SDK root level.
109-110
: LGTM! Compressible module properly documented and exposed.The new
compressible
module is well-documented as "SDK helpers for compressing and decompressing PDAs" and aligns with the PR objectives.
122-122
: LGTM! Size trait re-export added.The
Size
trait re-export follows the existing pattern and supports the compressible account functionality.sdk-libs/program-test/src/accounts/initialize.rs (2)
181-191
: LGTM! Second v2 state tree initialization follows established pattern.The initialization of the second v2 state tree correctly uses the new keypairs (
batched_state_merkle_tree_2
,batched_output_queue_2
,batched_cpi_context_2
) and reuses the same configuration parameters, which is appropriate for test infrastructure.
226-237
: LGTM! TestAccounts properly updated to include both v2 state trees.The returned
TestAccounts
struct correctly includes both v2 state trees in thev2_state_trees
vector, maintaining consistency with the initialization logic above.sdk-libs/program-test/src/utils/simulation.rs (1)
9-36
: LGTM! Well-implemented test utility for compute unit measurement.The
simulate_cu
function is properly implemented as a test utility:
- Correctly handles transaction creation and signing
- Appropriately uses
panic!
for error handling in test context- Cleanly extracts compute units from simulation result
- Provides useful functionality for performance testing
The function serves the expanded testing infrastructure well and will be valuable for measuring transaction costs in compressible account tests.
.github/workflows/light-examples-tests.yml (3)
7-8
: LGTM! Trigger paths updated for new SDK test structure.The addition of
program-tests/sdk-anchor-test/**
andprogram-tests/sdk-pinocchio-test/**
paths aligns with the workspace reorganization and ensures CI triggers for the new SDK test directories.Also applies to: 15-16
31-32
: LGTM! Job rename reflects updated scope.The rename from
system-programs
toexamples-tests
better reflects the job's purpose and aligns with the workflow separation strategy.
54-55
: LGTM! Matrix simplification aligns with workspace changes.The removal of
sdk-test-program
and retention of onlysdk-anchor-test-program
with its two sub-tests is consistent with the removal of theprogram-tests/sdk-test
package from the workspace.sdk-libs/light-compressible-client/Cargo.toml (1)
1-27
: Well-structured client library manifest.The Cargo.toml follows good practices with proper workspace dependency usage, appropriate feature flags for conditional Anchor support, and clear package metadata. The version aligns with the workspace ecosystem.
Cargo.toml (3)
23-23
: Good addition of the compressible client library.Adding
sdk-libs/light-compressible-client
to the workspace members enables proper dependency management for the new client instruction builders.
47-49
: Well-organized test package additions.The addition of specialized test packages (
anchor-compressible
,anchor-compressible-derived
,native-compressible
) provides comprehensive coverage for different compressible account scenarios, replacing the genericprogram-tests/sdk-test
.
92-92
: Appropriate dependency additions.The addition of
solana-rent
andlight-compressible-client
workspace dependencies supports the new compressible accounts functionality with proper version management.Also applies to: 157-157
sdk-libs/program-test/src/accounts/test_keypairs.rs (3)
17-19
: Good extension of test keypairs for v2 batched trees.Adding the second set of batched state merkle tree keypairs (
batched_state_merkle_tree_2
,batched_output_queue_2
,batched_cpi_context_2
) properly extends the test infrastructure to support multiple v2 compression state trees.
44-51
: Proper initialization of new keypairs.The constructor correctly initializes the new keypairs using the established pattern with
Keypair::from_bytes()
and appropriate error handling.
167-189
: Well-formatted deterministic keypair constants.The new keypair constants follow the existing format with proper 64-byte arrays and descriptive comments including the public keys. This maintains consistency with the existing test keypair management approach.
sdk-tests/anchor-compressible/Cargo.toml (4)
7-9
: Correct Anchor program crate configuration.The crate type configuration (
["cdylib", "lib"]
) and crate name are properly set for an Anchor program that can be built as both a Solana program and a Rust library.
11-18
: Well-configured Anchor feature flags.The feature configuration includes all standard Anchor features (
no-entrypoint
,cpi
,idl-build
) and properly enables IDL building for bothanchor-lang
andlight-sdk
.
21-29
: Appropriate dependency selection.The dependencies correctly include Light SDK with
anchor
,idl-build
, andv2
features, along with other necessary Light Protocol crates for compressible account functionality.
31-38
: Good test dependency configuration.The dev dependencies include the new
light-compressible-client
withanchor
feature and other testing utilities, enabling comprehensive testing of the Anchor compressible functionality.sdk-tests/anchor-compressible/CONFIG.md (6)
1-12
: Comprehensive overview of the compressible config system.The introduction clearly explains the purpose and benefits of the global compression parameter management system.
15-32
: Well-documented config initialization instruction.The documentation provides clear details about the
initialize_compression_config
instruction, including the critical security requirement that only the program's upgrade authority can call it.
34-48
: Clear update instruction documentation.The
update_compression_config
instruction is well-documented with proper emphasis on authority requirements and optional parameter updates.
69-82
: Strong security model documentation.The security model section clearly outlines the authority checks, validation requirements, and compression delay enforcement, which are critical for safe operation.
76-82
: Practical deployment guidance.The deployment process provides actionable steps with emphasis on the importance of immediate config initialization after program deployment.
87-94
: Good backward compatibility documentation.Documenting the legacy instructions and their limitations while encouraging use of the new config-based approach ensures smooth migration.
sdk-tests/native-compressible/src/update_config.rs (1)
31-37
: LGTM! Well-structured instruction data.The
UpdateConfigInstructionData
struct appropriately usesOption
types for all fields, allowing selective updates. The Borsh derive implementations are correct for serialization.sdk-tests/anchor-compressible-derived/README.md (1)
1-279
: Excellent comprehensive documentation!This README provides thorough coverage of the
add_compressible_instructions
macro usage, including:
- Clear setup examples with proper imports and macro application
- Detailed explanations of generated instructions and their purposes
- Complete TypeScript client examples
- Important validation rules for address spaces
- Migration guidance from manual implementations
- Best practices for configuration and operations
The documentation will significantly help developers adopt the compressible account system correctly.
.github/workflows/sdk-tests.yml (1)
71-89
: LGTM! Well-structured sequential testing approach.The sequential building and testing of SDK programs is appropriate to ensure .so files are available for each test. The increased stack size and disabled warnings for SBF compilation are necessary workarounds for the compilation environment.
sdk-tests/native-compressible/src/compress_dynamic_pda.rs (1)
64-72
: Good use of scoped borrowing.The scoped borrowing pattern ensures the account data borrow is dropped before the compression operation, preventing borrow checker issues. This is a good practice for handling mutable references.
sdk-libs/sdk/src/account.rs (2)
83-85
: Well-designed Size trait and implementation.The
Size
trait provides a clean abstraction for account size calculation, and the conditional implementation in thesize()
method correctly constrains it to types that implement the trait.Also applies to: 287-293
168-212
: Good constructor design for empty compressed accounts.The
new_mut_without_data
constructor appropriately handles the case of compressing PDAs with empty data, usingDEFAULT_DATA_HASH
and default account data. The documentation clearly explains the use case.sdk-tests/native-compressible/tests/test_config.rs (1)
15-16
: ADDRESS_SPACE and RENT_RECIPIENT should use different pubkeysBoth constants are set to the same pubkey value, which seems incorrect as they serve different purposes. ADDRESS_SPACE is typically used for PDA derivation while RENT_RECIPIENT receives rent from compressed accounts.
Please verify if this is intentional or if different pubkeys should be used for these constants.
sdk-tests/anchor-compressible/tests/test_config.rs (1)
1-12
: Excellent test coverage and documentationThe test suite is comprehensive and well-documented, covering all the important validation scenarios listed in the header comment. The security-focused tests (discriminator attacks, corruption scenarios) are particularly valuable.
sdk-libs/macros/src/native_compressible.rs (1)
1-8
: Well-structured macro implementationThe macro is well-organized with clear separation between parsing, structure generation, and processor generation. The generated code follows Solana program conventions and includes proper error handling.
sdk-libs/sdk/src/compressible/mod.rs (1)
20-24
: LGTM!The config module exports are well-organized and include all necessary constants and functions for compression configuration management.
sdk-libs/program-test/src/program_test/compressible_setup.rs (1)
22-29
: Confirm BPF loader upgradeable ProgramData layout. Please double-check that the mock byte layout really matches UpgradeableLoaderState::ProgramData in the Solana BPF upgradeable loader:
• Bytes 0–4: enum discriminator = 3 (ProgramData)
• Bytes 4–12: u64 slot (little-endian)
• Byte 12: COption tag (0 = None, 1 = Some)
• Bytes 13–45: authority pubkey (when tag = 1)
Ensure this aligns with the implementation in solana_program::bpf_loader_upgradeable::UpgradeableLoaderState (see the official docs at https://docs.rs/solana-program/latest/solana_program/bpf_loader_upgradeable/enum.UpgradeableLoaderState.html). If the enum variant values or the COption serialization have changed, adjust the offsets or tag size accordingly.
sdk-tests/native-compressible/src/lib.rs (1)
15-17
: LGTM! The program ID constant and CPI signer derivation follow proper Solana patterns using the Light macros.
sdk-tests/anchor-compressible/tests/test_instruction_builders.rs (2)
310-314
: LGTM! The validation tests properly verify error conditions and error message content, ensuring robust input validation.
98-232
: Excellent test coverage for instruction builders. The tests thoroughly validate instruction structure, account permissions, and data integrity following Solana SDK patterns.
sdk-tests/native-compressible/src/create_config.rs (1)
47-67
: LGTM! The data structures are well-designed with appropriate Borsh serialization traits and clear field names. The type alias for backward compatibility is a thoughtful addition.
sdk-tests/anchor-compressible-derived/Cargo.toml (1)
1-47
: LGTM! Package restructuring is well-organized. The transformation from a generic "sdk-test" to the specific "anchor-compressible-derived" package is cleanly executed. The workspace-managed dependencies ensure version consistency across the monorepo, and the feature flags properly support both IDL building and CPI functionality.
sdk-libs/sdk/src/compressible/compression_info.rs (1)
1-87
: Well-structured compression state management. The CompressionInfo struct and HasCompressionInfo trait provide a clean interface for managing compression state and timing. The use of saturating_sub in slots_until_compressible prevents underflow, and the state transitions are clearly defined.
sdk-libs/macros/src/compressible.rs (2)
41-453
: Comprehensive macro implementation with good error handling. The macro successfully generates all necessary boilerplate for compressible accounts including:
- Enum variants with proper trait implementations
- Config management instructions with authority checks
- Idempotent decompression supporting multiple accounts
- Compress instructions with config validation
- Proper error codes
The generated code follows Anchor best practices and integrates well with the Light SDK.
456-511
: Well-implemented derive macro for HasCompressionInfo. The derive macro correctly validates the presence of a compression_info field and generates the trait implementation with proper error handling for the Option type.
sdk-libs/macros/CHANGELOG.md (1)
1-94
: Excellent changelog with comprehensive migration guide. The changelog clearly documents the breaking changes and provides detailed migration instructions with code examples. The rationale for removing auto-generated create instructions (recognizing that they typically need custom business logic) is well-explained. The migration guide will help developers adapt to the new patterns effectively.
sdk-tests/native-compressible/tests/test_compressible_flow.rs (1)
sdk-tests/native-compressible/tests/test_compressible_flow.rs (1)
33-388
: Excellent comprehensive test coverage! The test thoroughly validates the complete lifecycle of compressible accounts including creation, compression, decompression, and recompression with proper delay checks and state verification.
sdk-tests/anchor-compressible-derived/tests/test_decompress_multiple.rs (7)
sdk-tests/anchor-compressible-derived/tests/test_decompress_multiple.rs (7)
31-106
: LGTM! Well-structured test for multi-account decompression. The test properly covers the creation and decompression of two different account types (UserRecord and GameSession), with appropriate setup and assertions.
108-169
: LGTM! Comprehensive test for account lifecycle with timing constraints. The test effectively validates the compression delay mechanism by testing both failure (due to insufficient slot delay) and success cases. Good coverage of error handling.
171-270
: LGTM! Well-implemented helper for creating compressed accounts. The function properly handles account creation with compression, includes CU simulation for performance tracking, and verifies the final state correctly.
272-462
: LGTM! Comprehensive helper for testing multi-account decompression. The function thoroughly verifies the decompression process for multiple accounts, including proper state transitions and compression metadata. The many parameters are justified given the complexity of the operation being tested.
867-991
: LGTM! Important security test for idempotent decompression. This test effectively validates that the decompression operation is idempotent, preventing potential double-decompression attacks while maintaining data integrity.
993-1060
: LGTM! Important test for cross-state-tree decompression. This test validates a critical scenario where accounts from different state trees are decompressed together, ensuring the system properly handles mixed sources.
1062-1165
: LGTM! Thorough test for compression info updates. The test effectively validates that compression metadata (specifically last_written_slot) is properly updated when modifying decompressed accounts, which is crucial for maintaining accurate state tracking.
sdk-libs/sdk/src/compressible/compress_account_on_init.rs (3)
sdk-libs/sdk/src/compressible/compress_account_on_init.rs (3)
25-67
: LGTM! Clean wrapper implementation. The function properly delegates to the batch helper with single-element arrays, following DRY principles while maintaining a convenient single-account API.
92-169
: LGTM! Well-implemented batch compression with security considerations. The function properly validates inputs, prevents lamports funding attacks by setting compression info early, and efficiently processes multiple accounts. Good defensive programming.
341-371
: LGTM! Proper native account closing implementation. The function correctly implements Solana's account closing pattern: transfer lamports, assign to system program, and reallocate to zero. Good error handling and use of checked arithmetic.
sdk-tests/anchor-compressible/src/lib.rs (6)
sdk-tests/anchor-compressible/src/lib.rs (6)
17-19
: LGTM! Proper program setup. Program ID and CPI signer are correctly defined following Anchor conventions.
27-68
: LGTM! Well-implemented account creation with compression. The instruction properly validates configuration, sets account data, and compresses in a single transaction. Good security check on rent recipient.
131-231
: LGTM! Robust multi-account decompression implementation. The instruction properly handles multiple account types through variant dispatching, maintains idempotency, and efficiently batches CPI calls. Good error handling and validation.
293-380
: LGTM! Efficient multi-account creation and compression. The instruction effectively creates and compresses two different account types in a single transaction, properly using the SDK's batching capabilities.
654-744
: LGTM! Well-structured compressible account definitions. The account structs properly implement all required traits for compression support. Good documentation explaining the manual steps needed (compression_info field, #[skip] attribute, #[hash] for large fields).
581-644
: LGTM! Well-designed variant enum for account polymorphism. The enum properly implements all required traits by delegating to the wrapped account types, enabling type-safe handling of multiple account types in decompression operations.
sdk-libs/macros/src/lib.rs (4)
sdk-libs/macros/src/lib.rs (4)
145-194
: LGTM! Documentation improvements for LightHasher. The enhanced documentation with examples and the new #[hash] attribute explanation significantly improves the usability of the macro. The rename from light_hasher_derive to derive_light_hasher follows better naming conventions.
196-229
: LGTM! Well-documented HasCompressionInfo derive macro. The new derive macro is well-structured with clear documentation, examples, and requirements. The requirement to mark the compression_info field with #[skip] to exclude it from hashing is properly documented.
231-251
: LGTM! Clear documentation for add_compressible_instructions macro. The macro is well-documented with clear usage instructions. The critical requirement to place it BEFORE the #[program] attribute is properly emphasized, which is essential for correct Anchor macro processing.
278-289
: LGTM! Proper deprecation handling. Good practice to maintain the deprecated compressible macro for backward compatibility while clearly directing users to use #[add_compressible_instructions] instead.
sdk-tests/anchor-compressible/tests/test_decompress_multiple.rs (2)
145-204
: LGTM! Thorough testing of compression delay constraints. Excellent test coverage for the compression delay mechanism. The test properly validates:
- Error case when attempting to compress too soon (slot delay constraint)
- Success case after sufficient slot advancement
- Specific error code validation (Custom(16001))
1022-1144
: LGTM! Critical security test for idempotent decompression. Excellent test for validating idempotent decompression behavior. This is crucial for preventing potential security issues where multiple decompression attempts could corrupt account state. The test properly verifies that:
- Second decompression succeeds idempotently
- Account state remains uncorrupted
- Data integrity is maintained
sdk-tests/anchor-compressible-derived/src/lib.rs (1)
71-89
: LGTM! Proper compression info update. The implementation correctly updates the compression info's last written slot after modifying the account data. The comment clearly indicates this is a manual operation required for compressible accounts.
sdk-libs/light-compressible-client/src/lib.rs (2)
sdk-libs/light-compressible-client/src/lib.rs (2)
1-4
: LGTM! Clean conditional compilation setup. Excellent use of feature flags and type aliases to support both Anchor and Borsh serialization with a unified API.
318-323
: LGTM! Proper input validation. Good defensive programming with clear, descriptive error messages for length mismatches.
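A minimal example of the kind of length check being praised here (hypothetical names, not the actual light-compressible-client API):

fn check_lengths(pdas: &[[u8; 32]], compressed_accounts: &[Vec<u8>]) -> Result<(), String> {
    if pdas.len() != compressed_accounts.len() {
        return Err(format!(
            "length mismatch: {} PDAs but {} compressed accounts",
            pdas.len(),
            compressed_accounts.len()
        ));
    }
    Ok(())
}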
// TODO: Find cleaner way to do this.
tree_info: packed_tree_infos
    .state_trees
    .as_ref()
    .unwrap()
    .packed_tree_infos
    .iter()
    .find(|pti| {
        pti.queue_pubkey_index == queue_index
            && pti.leaf_index == compressed_account.leaf_index
    })
    .copied()
    .ok_or("Matching PackedStateTreeInfo (queue_pubkey_index + leaf_index) not found")?,
🛠️ Refactor suggestion
Address TODO: Refactor tree info matching logic
The TODO comment correctly identifies the need for a cleaner implementation. This duplicates the logic from the compress_account method. As suggested above, extract this into a shared helper method.
Would you like me to create an issue to track this refactoring task?
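A possible shape for that shared helper, sketched against the fields used in the snippet above (the local PackedStateTreeInfo stand-in only carries the two fields needed here; the real SDK type has more):

#[derive(Clone, Copy)]
struct PackedStateTreeInfo {
    queue_pubkey_index: u8,
    leaf_index: u32,
}

fn find_tree_info(
    infos: &[PackedStateTreeInfo],
    queue_index: u8,
    leaf_index: u32,
) -> Result<PackedStateTreeInfo, &'static str> {
    infos
        .iter()
        .find(|pti| pti.queue_pubkey_index == queue_index && pti.leaf_index == leaf_index)
        .copied()
        .ok_or("Matching PackedStateTreeInfo (queue_pubkey_index + leaf_index) not found")
}

Both compress_account and the decompress path could then call find_tree_info instead of repeating the iterator chain.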
🤖 Prompt for AI Agents
In sdk-libs/light-compressible-client/src/lib.rs around lines 346 to 358, the
tree_info matching logic duplicates code from the compress_account method and is
marked with a TODO for cleanup. Refactor by extracting this matching logic into
a shared helper function that takes the necessary parameters and returns the
matching PackedStateTreeInfo. Replace the current inline code with a call to
this new helper to improve code reuse and clarity.
let light_discriminator_impl: Item = syn::parse_quote! {
    impl light_sdk::LightDiscriminator for CompressedAccountVariant {
        const LIGHT_DISCRIMINATOR: [u8; 8] = [0; 8]; // This won't be used directly
        const LIGHT_DISCRIMINATOR_SLICE: &'static [u8] = &Self::LIGHT_DISCRIMINATOR;
    }
};
🛠️ Refactor suggestion
Clarify the purpose of hardcoded discriminator.
The hardcoded zero discriminator with a comment saying "won't be used directly" is confusing and could lead to misunderstandings.
Consider adding a more descriptive comment or using a different approach:
impl light_sdk::LightDiscriminator for CompressedAccountVariant {
- const LIGHT_DISCRIMINATOR: [u8; 8] = [0; 8]; // This won't be used directly
+ // The enum itself doesn't have a discriminator - each variant uses its struct's discriminator
+ const LIGHT_DISCRIMINATOR: [u8; 8] = [0; 8]; // Placeholder - actual discriminators come from variant structs
const LIGHT_DISCRIMINATOR_SLICE: &'static [u8] = &Self::LIGHT_DISCRIMINATOR;
}
🤖 Prompt for AI Agents
In sdk-libs/macros/src/compressible.rs around lines 98 to 103, the hardcoded
zero array for LIGHT_DISCRIMINATOR with the comment "This won't be used
directly" is unclear and may confuse readers. Update the comment to clearly
explain why the zero value is used and under what circumstances it is safe or
intended to be ignored, or alternatively, replace the hardcoded zero with a more
meaningful placeholder or constant that better reflects its purpose.
// Generate LightDiscriminator implementation with correct constants and method signature
let light_discriminator_impl: Item = syn::parse_quote! {
    impl light_sdk::LightDiscriminator for CompressedAccountVariant {
        const LIGHT_DISCRIMINATOR: [u8; 8] = [0; 8]; // Default discriminator for enum
Zero discriminator for enum could cause conflicts
The CompressedAccountVariant enum uses [0; 8] as its discriminator. This could potentially conflict with account types that also use a zero discriminator, leading to deserialization issues.
Consider using a unique discriminator for the enum:
- const LIGHT_DISCRIMINATOR: [u8; 8] = [0; 8]; // Default discriminator for enum
+ // Use a unique discriminator to avoid conflicts
+ const LIGHT_DISCRIMINATOR: [u8; 8] = [0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x01];
Alternatively, consider if the enum actually needs a discriminator since it's not directly serialized as an account.
🤖 Prompt for AI Agents
In sdk-libs/macros/src/native_compressible.rs at line 153, the enum
CompressedAccountVariant uses a zeroed array [0; 8] as its discriminator, which
may conflict with other account types using the same zero discriminator and
cause deserialization issues. To fix this, assign a unique, non-zero
discriminator value that does not overlap with existing account discriminators,
or evaluate if the enum requires a discriminator at all and remove it if
unnecessary to avoid conflicts.
pub fn process_update_config(
    accounts: &[AccountInfo],
    instruction_data: &[u8],
) -> Result<(), LightSdkError> {
    let mut instruction_data = instruction_data;
    let instruction_data = UpdateConfigInstructionData::deserialize(&mut instruction_data)
        .map_err(|_| LightSdkError::Borsh)?;

    // Get accounts
    let config_account = &accounts[0];
    let authority = &accounts[1];

    process_update_compression_config(
        config_account,
        authority,
        instruction_data.new_update_authority.as_ref(),
        instruction_data.new_rent_recipient.as_ref(),
        instruction_data.new_address_space,
        instruction_data.new_compression_delay,
        &crate::ID,
    )?;

    Ok(())
}
🛠️ Refactor suggestion
Add bounds checking for account access.
The function accesses accounts by hardcoded indices without validating the account count first, which could cause a panic if fewer accounts are provided.
pub fn process_update_config(
accounts: &[AccountInfo],
instruction_data: &[u8],
) -> Result<(), LightSdkError> {
+ // Validate minimum account count
+ if accounts.len() < 2 {
+ return Err(LightSdkError::AccountNotFound);
+ }
+
let mut instruction_data = instruction_data;
let instruction_data = UpdateConfigInstructionData::deserialize(&mut instruction_data)
.map_err(|_| LightSdkError::Borsh)?;
// Get accounts
let config_account = &accounts[0];
let authority = &accounts[1];
🤖 Prompt for AI Agents
In sdk-tests/native-compressible/src/update_config.rs lines 6 to 29, the
function accesses accounts[0] and accounts[1] without checking if the accounts
slice has at least two elements, risking a panic. Add a check at the start of
the function to verify accounts.len() is at least 2, and return an appropriate
error if not, before accessing the accounts by index.
Actionable comments posted: 1
♻️ Duplicate comments (1)
.github/workflows/sdk-tests.yml (1)
30-90
: Add explicit permissions to the workflow. The workflow should include an explicit permissions block to follow security best practices. This limits the GITHUB_TOKEN permissions to only what's necessary.
Add this permissions block after line 30:
jobs:
  sdk-tests:
    name: sdk-tests
+   permissions:
+     contents: read
    if: github.event.pull_request.draft == false
🧹 Nitpick comments (10)
sdk-tests/native-compressible/tests/test_compressible_flow.rs (2)
4-4
: Remove unused import core::panic. The panic macro is available by default from the standard library. This import is unnecessary.
-use core::panic;
-
111-114
: Remove or use the unused _test_data parameter. The parameter _test_data is passed to the function but never used. Either implement its usage or remove it from the function signature and call sites.
async fn create_and_compress_account(
    rpc: &mut LightProgramTest,
    payer: &Keypair,
-   _test_data: [u8; 31],
) -> Pubkey {
sdk-tests/anchor-compressible/tests/test_decompress_multiple.rs (2)
204-303
: Consider extracting common Light Protocol setup code. Both test_create_record and test_create_game_session have identical setup code for Light Protocol accounts (lines 213-217 and 314-318). Consider extracting this into a shared helper function to reduce duplication.
fn setup_light_protocol_accounts(program_id: &Pubkey) -> PackedAccounts {
    let mut remaining_accounts = PackedAccounts::default();
    let system_config = SystemAccountMetaConfig::new(*program_id);
    remaining_accounts.add_system_accounts(system_config);
    remaining_accounts
}
Also applies to: 305-420
281-281
: Remove or convert debug println statements. Debug println statements should be removed or converted to proper test logging before merging.
Either remove these println statements or use a proper logging framework if debug output is needed for test diagnostics.
Also applies to: 414-414, 505-506, 524-524, 712-712, 861-862, 975-976
sdk-tests/anchor-compressible-derived/src/lib.rs (6)
35-37
: Improve error handling for config loading. The error mapping loses the original error information from load_checked, which could make debugging difficult.
- let config = CompressibleConfig::load_checked(&ctx.accounts.config, &crate::ID)
-     .map_err(|_| anchor_lang::error::ErrorCode::AccountDidNotDeserialize)?;
+ let config = CompressibleConfig::load_checked(&ctx.accounts.config, &crate::ID)
+     .map_err(|e| {
+         msg!("Failed to load compression config: {:?}", e);
+         anchor_lang::error::ErrorCode::AccountDidNotDeserialize
+     })?;
43-46
: Simplify compression info initialization. The nested error mapping is unnecessarily complex and loses error context.
- user_record.compression_info = Some(
-     CompressionInfo::new_decompressed()
-         .map_err(|_| anchor_lang::error::ErrorCode::AccountDidNotDeserialize)?,
- );
+ user_record.compression_info = Some(CompressionInfo::new_decompressed()?);
This assumes CompressionInfo::new_decompressed() returns a compatible error type. If not, consider creating a custom error variant for compression info initialization failures.
102-103
: Inconsistent error handling pattern. Same issue as line 35-37. Consider extracting a helper function for consistent config loading.
+fn load_compression_config(config_account: &AccountInfo, program_id: &Pubkey) -> Result<CompressibleConfig> {
+    CompressibleConfig::load_checked(config_account, program_id)
+        .map_err(|e| {
+            msg!("Failed to load compression config: {:?}", e);
+            anchor_lang::error::ErrorCode::AccountDidNotDeserialize.into()
+        })
+}
- let config = CompressibleConfig::load_checked(&ctx.accounts.config, &crate::ID)
-     .map_err(|_| anchor_lang::error::ErrorCode::AccountDidNotDeserialize)?;
+ let config = load_compression_config(&ctx.accounts.config, &crate::ID)?;
115-118
: Duplicate compression info initialization pattern. Same pattern as lines 43-46. If the refactoring suggestion is applied there, apply it here too for consistency.
128-131
: Another duplicate compression info initialization. Third occurrence of the same pattern. Consider extracting this into a helper function.
+fn init_compression_info() -> Result<CompressionInfo> {
+    CompressionInfo::new_decompressed()
+        .map_err(|e| {
+            msg!("Failed to initialize compression info: {:?}", e);
+            anchor_lang::error::ErrorCode::AccountDidNotDeserialize.into()
+        })
+}
// Then use: compression_info = Some(init_compression_info()?);
297-301
: Consider adding more specific error types. The single InvalidRentRecipient error might be insufficient for comprehensive error handling. Consider adding errors for compression failures, config validation issues, etc.
#[error_code]
pub enum ErrorCode {
    #[msg("Rent recipient does not match config")]
    InvalidRentRecipient,
+   #[msg("Failed to initialize compression info")]
+   CompressionInfoInitFailed,
+   #[msg("Invalid compression configuration")]
+   InvalidCompressionConfig,
}
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (2)
Cargo.lock is excluded by !**/*.lock
pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (14)
.github/workflows/sdk-tests.yml (1 hunks)
program-tests/sdk-anchor-test/Anchor.toml (1 hunks)
program-tests/sdk-anchor-test/package.json (2 hunks)
programs/package.json (1 hunks)
scripts/format.sh (1 hunks)
sdk-tests/anchor-compressible-derived/src/lib.rs (1 hunks)
sdk-tests/anchor-compressible-derived/tests/test_decompress_multiple.rs (1 hunks)
sdk-tests/anchor-compressible/tests/test_config.rs (1 hunks)
sdk-tests/anchor-compressible/tests/test_decompress_multiple.rs (1 hunks)
sdk-tests/native-compressible/Cargo.toml (1 hunks)
sdk-tests/native-compressible/src/decompress_dynamic_pda.rs (1 hunks)
sdk-tests/native-compressible/tests/test_compressible_flow.rs (1 hunks)
sdk-tests/native-compressible/tests/test_config.rs (1 hunks)
sdk-tests/package.json (1 hunks)
✅ Files skipped from review due to trivial changes (4)
- program-tests/sdk-anchor-test/package.json
- program-tests/sdk-anchor-test/Anchor.toml
- scripts/format.sh
- sdk-tests/package.json
🚧 Files skipped from review as they are similar to previous changes (4)
- sdk-tests/native-compressible/Cargo.toml
- sdk-tests/native-compressible/tests/test_config.rs
- sdk-tests/native-compressible/src/decompress_dynamic_pda.rs
- sdk-tests/anchor-compressible-derived/tests/test_decompress_multiple.rs
🧰 Additional context used
🪛 GitHub Check: CodeQL
.github/workflows/sdk-tests.yml
[warning] 32-89: Workflow does not contain permissions
Actions job or workflow does not limit the permissions of the GITHUB_TOKEN. Consider setting an explicit permissions block, using the following as a minimal starting point: {{contents: read}}
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (22)
- GitHub Check: sdk-tests
- GitHub Check: cli-v1
- GitHub Check: system-programs (account-compression, ["cargo-test-sbf -p account-compression-test"])
- GitHub Check: lint
- GitHub Check: system-programs (system-cpi-test, ["cargo-test-sbf -p system-cpi-test"])
- GitHub Check: system-programs (light-system-program, ["cargo-test-sbf -p system-test"])
- GitHub Check: system-programs (light-registry, ["cargo-test-sbf -p registry-test"])
- GitHub Check: Test sdk-libs
- GitHub Check: system-programs (system-cpi-test-v2-functional, ["cargo-test-sbf -p system-cpi-v2-test -- functio...
- GitHub Check: system-programs (random-e2e-test, ["cargo-test-sbf -p e2e-test"])
- GitHub Check: system-programs (light-compressed-token, ["cargo-test-sbf -p compressed-token-test"])
- GitHub Check: Test program-libs-slow
- GitHub Check: system-programs (system-cpi-test-v2-event, ["cargo-test-sbf -p system-cpi-v2-test -- event::parse"])
- GitHub Check: system-programs (system-cpi-test-v2-other, ["cargo-test-sbf -p system-cpi-v2-test -- --skip funct...
- GitHub Check: Test concurrent-merkle-tree
- GitHub Check: stateless-js-v1
- GitHub Check: Test batched-merkle-tree-simulate
- GitHub Check: Test program-libs-fast
- GitHub Check: Forester e2e test
- GitHub Check: examples-tests (sdk-anchor-test-program, ["cargo-test-sbf -p sdk-anchor-test", "cargo-test-sbf -p...
- GitHub Check: cli-v2
- GitHub Check: stateless-js-v2
🔇 Additional comments (17)
sdk-tests/native-compressible/tests/test_compressible_flow.rs (3)
32-109
: Well-structured integration test! The test comprehensively covers the full lifecycle of compressible accounts with proper slot warping to handle compression delays.
204-266
: Clean implementation of account decompression. Good use of idempotent decompression and proper handling of compression info initialization.
326-391
: Thorough verification functions. Both verification functions comprehensively check account states with appropriate assertions for lamports, data, and compression info.
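For reference, the checks such helpers perform look roughly like this (illustrative types, not the actual test code):

struct DecompressedState {
    lamports: u64,
    data: Vec<u8>,
    last_written_slot: u64,
}

fn verify_decompressed(state: &DecompressedState, expected_data: &[u8], current_slot: u64) {
    assert!(state.lamports > 0, "decompressed PDA must be rent-funded");
    assert_eq!(state.data, expected_data, "account data mismatch");
    assert!(
        state.last_written_slot <= current_slot,
        "compression info slot must not be in the future"
    );
}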
programs/package.json (1)
15-15
: LGTM! Script updated to match new package structure. The update correctly reflects the replacement of sdk-test with native-compressible package.
.github/workflows/sdk-tests.yml (1)
71-89
: Well-configured build and test steps. Good use of increased stack size for SBF compilation and sequential test execution to ensure proper build artifacts. The explanatory comments are helpful.
sdk-tests/anchor-compressible/tests/test_config.rs (2)
33-125
: Excellent test coverage for configuration validation. The tests thoroughly validate initialization, authority checks, and address space constraints. Good inclusion of edge cases like empty and multiple address spaces.
378-628
: Outstanding security test coverage! These discriminator attack tests are exemplary - they systematically validate that corruption of critical fields (discriminator, version, address_space, owner) is properly detected and rejected. The careful state restoration between tests ensures test isolation.
sdk-tests/anchor-compressible/tests/test_decompress_multiple.rs (3)
30-142
: Well-structured multi-account decompression test. Good coverage of creating and decompressing multiple accounts both separately and combined. The test effectively validates the multi-account decompression functionality.
1020-1141
: Important idempotent decompression test. This test effectively validates that the decompression operation is truly idempotent and doesn't corrupt account state on repeated attempts. Critical for security.
1143-1222
: Excellent test for state tree flexibility. This test effectively validates that accounts created in different state trees can be decompressed together, which is crucial for system scalability and flexibility.
sdk-tests/anchor-compressible-derived/src/lib.rs (7)
1-18
: LGTM! Clean imports and proper setup. The imports are well-organized and the program ID declaration with CPI signer derivation follows best practices for Light Protocol integration.
283-283
: Another space calculation to verify. Similar concern as line 238. The space calculation space = 8 + 10 + 8 + 32 + 4 + 32 + 8 + 9 + 8 should be verified against the GameSession struct layout.
310-319
: LGTM! Well-structured compression parameters. The CompressionParams struct properly encapsulates all necessary parameters for batch compression operations with clear naming and appropriate types.
82-86
: Manual set_last_written_slot() is the intended pattern. Verified across the SDK and both native/Anchor test suites that there is no automatic slot update—every implementation explicitly calls set_last_written_slot(). The manual update in sdk-tests/anchor-compressible-derived/src/lib.rs is correct; no changes are required.
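Illustrative sketch of the manual pattern (local stand-in types; the real CompressionInfo API may differ, e.g. it may read the slot from Clock itself):

#[derive(Default)]
struct CompressionInfo {
    last_written_slot: u64,
}

impl CompressionInfo {
    fn set_last_written_slot(&mut self, slot: u64) {
        self.last_written_slot = slot;
    }
}

struct UserRecord {
    score: u64,
    compression_info: CompressionInfo,
}

fn update_record(record: &mut UserRecord, new_score: u64, current_slot: u64) {
    record.score = new_score;
    // Nothing updates the slot automatically - the handler must do it explicitly.
    record.compression_info.set_last_written_slot(current_slot);
}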
238-238
: Confirm space calculation insdk-tests/anchor-compressible-derived/src/lib.rs:238
The sum
space = 8 + 32 + 4 + 32 + 8 + 10
includes an unexplained
+ 10
. We weren’t able to locate any documentation or examples that define where those 10 bytes come from (e.g., padding, anOption
discriminant, or extra fields inCompressionInfo
). Please verify that each component matches the actual struct layout:
- 8 bytes: account discriminator
- 32 bytes: (first field)
- 4 bytes: string length prefix
- 32 bytes: string content
- 8 bytes:
u64
timestamp (or equivalent)- 10 bytes: ?
To confirm, you can log the actual size at runtime:
use std::mem::size_of; println!("DerivedAccount size: {}", size_of::<DerivedAccount>());Or inspect the
CompressionInfo
and related structs directly to account for any enum tags or padding. Once you’ve determined the correct size, adjust thespace
calculation accordingly.
145-182
: Add an explicit integration test for mixed‐type batch compressionI didn’t find any existing tests or docs covering this pattern in
sdk-tests/anchor-compressible-derived/src/lib.rs
, so we need to ensure the CPI invocation handles multiple account types correctly. Please add an integration test that exercises:
- Calling
prepare_accounts_for_compression_on_init
for bothUserRecord
andGameSession
in the same batch.- Building
CpiInputs::new_with_address
with the combinedall_compressed_infos
and new address params in the expected order.- Invoking
invoke_light_system_program
and verifying it applies the proof, addresses, and state‐tree indexes to each account without misalignment.- Confirming both compressed accounts are created successfully in one CPI.
This will catch any parameter‐ordering or mixed‐type handling issues before merging.
198-198
: Re-export is correct; no changes needed.The
add_compressible_instructions(UserRecord, GameSession)
attribute on theanchor_compressible_derived
module emits both
pub enum CompressedAccountVariant { UserRecord(UserRecord), GameSession(GameSession) }
pub struct CompressedAccountData { pub meta: CompressedAccountMeta, pub data: CompressedAccountVariant, pub seeds: Vec<Vec<u8>> }
at compile time. The subsequent
pub use crate::anchor_compressible_derived::{CompressedAccountData, CompressedAccountVariant};simply re-exports those generated types for client use. No edits are required here—just ensure the crate builds successfully to confirm the macro expansion.
pub const LIGHT_CPI_SIGNER: CpiSigner =
    derive_light_cpi_signer!("GRLu2hKaAiMbxpkAM1HeXzks9YeGuz18SEgXEizVvPqX");

#[add_compressible_instructions(UserRecord, GameSession)]
💡 Verification agent
🧩 Analysis chain
Verify macro-generated instruction coverage.
The add_compressible_instructions
macro is applied to UserRecord
and GameSession
types. Ensure that all required compressible instructions are properly generated and exposed.
🏁 Script executed:
#!/bin/bash
# Search for the macro definition to understand what instructions it generates
rg -A 20 "macro_rules! add_compressible_instructions" --type rust
# Also search for any usage or tests of the generated instructions
rg "compress_record|compress_session|decompress_accounts_idempotent" --type rust -A 3
Length of output: 15337
🏁 Script executed:
#!/bin/bash
rg -n -A50 "add_compressible_instructions" --type rust sdk-tests/anchor-compressible-derived/src/lib.rs
Length of output: 4261
Ensure all macro-generated compressible instructions are exposed and tested
The #[add_compressible_instructions(UserRecord, GameSession)]
macro (line 19) emits the following Anchor handlers and account structs:
- initialize_compression_config
- update_compression_config
- compress_record
- compress_session
- decompress_accounts_idempotent
Currently, only CompressedAccountData
and CompressedAccountVariant
are re-exported (lines 198–200). To give clients full access—and to satisfy end-to-end coverage—please:
• Add explicit re-exports for each generated instruction handler and its Accounts struct:
pub use crate::anchor_compressible_derived::{CompressedAccountData, CompressedAccountVariant};
+ pub use crate::anchor_compressible_derived::{
+ initialize_compression_config, UpdateCompressionConfig,
+ compress_record, compress_session,
+ decompress_accounts_idempotent,
+ InitializeCompressionConfig, UpdateCompressionConfig,
+ CompressRecord, CompressSession, DecompressAccountsIdempotent,
+ };
• Add tests for the missing instructions:
– compress_session
– config-management ops (initialize_compression_config
/ update_compression_config
), if not already covered.
Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In sdk-tests/anchor-compressible-derived/src/lib.rs at line 19, the macro
#[add_compressible_instructions(UserRecord, GameSession)] generates several
instruction handlers and account structs that are not currently re-exported or
tested. To fix this, explicitly re-export all generated instruction handlers and
their corresponding Accounts structs, including initialize_compression_config,
update_compression_config, compress_record, compress_session, and
decompress_accounts_idempotent. Additionally, add tests covering
compress_session and the configuration management operations
(initialize_compression_config and update_compression_config) to ensure full
client access and end-to-end coverage.
Actionable comments posted: 2
♻️ Duplicate comments (2)
sdk-tests/native-compressible/src/create_dynamic_pda.rs (2)
27-32
: Add bounds checking for accounts array access. The function accesses accounts array indices 0-4 without checking if sufficient accounts are provided. This could cause a panic if fewer than 5 accounts are passed.
110-110
: Validate remaining accounts length before accessing. The CpiAccounts constructor accesses accounts[5..] without ensuring enough accounts exist.
🧹 Nitpick comments (2)
sdk-tests/anchor-compressible-derived/tests/test_decompress_multiple.rs (1)
1002-1002
: Consider using consistent PDA derivation method. For consistency with other tests, consider using derive_pda(&program_id, 0) instead of derive_default_pda(&program_id).
- let config_pda = CompressibleConfig::derive_default_pda(&program_id).0;
+ let config_pda = CompressibleConfig::derive_pda(&program_id, 0).0;
sdk-tests/anchor-compressible/tests/test_decompress_multiple.rs (1)
974-976
: Remove or explain commented CU simulation code. Either remove the commented code or provide a clear explanation for why CU simulation is being skipped in this specific case.
- // let cu = simulate_cu(rpc, &payer, &instruction).await;
- // println!("DecompressSingleUserRecord CU consumed: {}", cu);
- println!("skipping cu sim");
+ // Note: CU simulation skipped here to avoid [provide specific reason]
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (4)
sdk-tests/anchor-compressible-derived/tests/test_decompress_multiple.rs (1 hunks)
sdk-tests/anchor-compressible/tests/test_decompress_multiple.rs (1 hunks)
sdk-tests/native-compressible/src/create_dynamic_pda.rs (1 hunks)
sdk-tests/native-compressible/src/lib.rs (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- sdk-tests/native-compressible/src/lib.rs
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (22)
- GitHub Check: lint
- GitHub Check: stateless-js-v1
- GitHub Check: stateless-js-v2
- GitHub Check: examples-tests (sdk-anchor-test-program, ["cargo-test-sbf -p sdk-anchor-test", "cargo-test-sbf -p...
- GitHub Check: cli-v2
- GitHub Check: cli-v1
- GitHub Check: Forester e2e test
- GitHub Check: Test program-libs-fast
- GitHub Check: Test concurrent-merkle-tree
- GitHub Check: Test program-libs-slow
- GitHub Check: Test sdk-libs
- GitHub Check: Test batched-merkle-tree-simulate
- GitHub Check: system-programs (system-cpi-test-v2-functional, ["cargo-test-sbf -p system-cpi-v2-test -- functio...
- GitHub Check: system-programs (light-system-program, ["cargo-test-sbf -p system-test"])
- GitHub Check: system-programs (system-cpi-test-v2-event, ["cargo-test-sbf -p system-cpi-v2-test -- event::parse"])
- GitHub Check: system-programs (system-cpi-test-v2-other, ["cargo-test-sbf -p system-cpi-v2-test -- --skip funct...
- GitHub Check: system-programs (random-e2e-test, ["cargo-test-sbf -p e2e-test"])
- GitHub Check: system-programs (light-registry, ["cargo-test-sbf -p registry-test"])
- GitHub Check: system-programs (light-compressed-token, ["cargo-test-sbf -p compressed-token-test"])
- GitHub Check: system-programs (system-cpi-test, ["cargo-test-sbf -p system-cpi-test"])
- GitHub Check: system-programs (account-compression, ["cargo-test-sbf -p account-compression-test"])
- GitHub Check: sdk-tests
🔇 Additional comments (12)
sdk-tests/anchor-compressible-derived/tests/test_decompress_multiple.rs (9)
1-29
: LGTM! The imports and constants are properly organized and appropriate for the test functionality.
31-106
: LGTM! Comprehensive test that validates both individual and combined account creation/decompression workflows.
108-169
: LGTM! Well-structured test that properly validates the compression delay mechanism and account lifecycle.
171-270
: LGTM! Helper function correctly implements compressed account creation with proper validations.
462-643
: LGTM! Well-implemented helper that properly handles combined account creation with appropriate validations.
645-761
: LGTM! Well-designed helper with proper error handling and state verification for compression operations.
763-864
: LGTM! Comprehensive helper function that properly validates single account decompression.
866-990
: LGTM! Excellent security test that properly validates idempotent decompression behavior to prevent double decompression attacks.
1061-1164
: LGTM! Comprehensive test that properly validates compression info updates when modifying decompressed accounts.
sdk-tests/anchor-compressible/tests/test_decompress_multiple.rs (3)
1-29
: LGTM! Imports and constants are properly configured for the anchor_compressible test suite.
305-420
: LGTM! Well-implemented helper for creating game sessions with proper validation.
1143-1222
: LGTM! Excellent test that explicitly validates decompression across different state trees.
let cu = simulate_cu(rpc, payer, &instruction).await;
println!("decompress_multiple_pdas CU consumed: {}", cu);
Remove duplicate CU simulation.
The compute unit simulation is performed twice with identical parameters.
Remove the duplicate simulation:
let cu = simulate_cu(rpc, payer, &instruction).await;
println!("decompress_multiple_pdas CU consumed: {}", cu);
// Verify PDAs are uninitialized before decompression
let user_pda_account = rpc.get_account(*user_record_pda).await.unwrap();
assert_eq!(
user_pda_account.as_ref().map(|a| a.data.len()).unwrap_or(0),
0,
"User PDA account data len must be 0 before decompression"
);
let game_pda_account = rpc.get_account(*game_session_pda).await.unwrap();
assert_eq!(
game_pda_account.as_ref().map(|a| a.data.len()).unwrap_or(0),
0,
"Game PDA account data len must be 0 before decompression"
);
- let cu = simulate_cu(rpc, payer, &instruction).await;
- println!("decompress_multiple_pdas CU consumed: {}", cu);
let result = rpc
.create_and_send_transaction(&[instruction], &payer.pubkey(), &[payer])
.await;
Also applies to: 373-374
🤖 Prompt for AI Agents
In sdk-tests/anchor-compressible-derived/tests/test_decompress_multiple.rs at
lines 355-356 and 373-374, the compute unit simulation is duplicated with the
same parameters. Remove the second call to simulate_cu and the associated
println to avoid redundant computation and output.
let cu = simulate_cu(rpc, payer, &instruction).await;
println!("decompress_multiple_pdas CU consumed: {}", cu);
Remove duplicate CU simulation.
The compute unit simulation is performed twice with identical parameters.
Remove the duplicate simulation:
let cu = simulate_cu(rpc, payer, &instruction).await;
println!("decompress_multiple_pdas CU consumed: {}", cu);
// Verify PDAs are uninitialized before decompression
let user_pda_account = rpc.get_account(*user_record_pda).await.unwrap();
assert_eq!(
user_pda_account.as_ref().map(|a| a.data.len()).unwrap_or(0),
0,
"User PDA account data len must be 0 before decompression"
);
let game_pda_account = rpc.get_account(*game_session_pda).await.unwrap();
assert_eq!(
game_pda_account.as_ref().map(|a| a.data.len()).unwrap_or(0),
0,
"Game PDA account data len must be 0 before decompression"
);
- let cu = simulate_cu(rpc, payer, &instruction).await;
- println!("decompress_multiple_pdas CU consumed: {}", cu);
let result = rpc
.create_and_send_transaction(&[instruction], &payer.pubkey(), &[payer])
.await;
Also applies to: 523-524
🤖 Prompt for AI Agents
In sdk-tests/anchor-compressible/tests/test_decompress_multiple.rs at lines
505-506 and 523-524, the compute unit simulation is duplicated with the same
parameters. Remove the second call to simulate_cu and the associated println to
avoid redundant computation and output.
Actionable comments posted: 0
♻️ Duplicate comments (3)
sdk-libs/macros/src/compressible.rs (2)
296-303
: Validation added correctly addresses previous review. The empty struct validation has been properly implemented, addressing the previous review comment about potential panics when accessing first().
330-335
: Discriminator comment remains unclear despite previous feedback. The hardcoded zero discriminator comment is still confusing. The explanation doesn't clarify when or how this placeholder is used safely.
Consider a more descriptive comment:
impl light_sdk::LightDiscriminator for CompressedAccountVariant {
-     const LIGHT_DISCRIMINATOR: [u8; 8] = [0; 8]; // This won't be used directly
+     // Enum discriminator is unused - each variant uses its own struct's discriminator during dispatch
+     const LIGHT_DISCRIMINATOR: [u8; 8] = [0; 8]; // Placeholder value, not used in practice
      const LIGHT_DISCRIMINATOR_SLICE: &'static [u8] = &Self::LIGHT_DISCRIMINATOR;
  }
sdk-libs/macros/src/lib.rs (1)
5-5
: Import issue partially resolved but ItemMod still needed. While ItemMod was removed from the import, it's still being used in the new macro functions at lines 246, 271, 320, and 322. The import needs to be added back.
-use syn::{parse_macro_input, DeriveInput, ItemStruct};
+use syn::{parse_macro_input, DeriveInput, ItemMod, ItemStruct};
🧹 Nitpick comments (4)
sdk-libs/macros/src/compressible.rs (4)
33-55
: Consider more robust error handling for malformed attributes. The function assumes well-formed #[instruction(...)]
attributes but could fail silently on malformed input. The nested meta parsing could benefit from more explicit error handling.fn extract_instruction_param_names(attrs: &[Attribute]) -> Vec<String> { for attr in attrs { if attr.path().is_ident("instruction") { let mut param_names = Vec::new(); - let _ = attr.parse_nested_meta(|meta| { + if let Err(e) = attr.parse_nested_meta(|meta| { // Extract the parameter name from the path if let Some(ident) = meta.path.get_ident() { param_names.push(ident.to_string()); } // Skip the type if present (after colon) if meta.input.peek(Token![:]) { meta.input.parse::<Token![:]>()?; meta.input.parse::<syn::Type>()?; } Ok(()) - }); + }) { + // Log or handle parsing error appropriately + continue; + } if !param_names.is_empty() { return param_names; } } } vec!["account_data".to_string()] // Default fallback }
196-230
: Validate parameter conversion logic more thoroughly. The parameter conversion logic makes assumptions about naming patterns that may not hold universally. Consider adding validation or more flexible pattern matching.
fn convert_single_seed_parameter(seed: Expr, _target_type: &Ident) -> Result<Expr> { // Use visitor pattern to find and replace parameter references struct ParameterConverter { converted: bool, + errors: Vec<String>, } impl visit_mut::VisitMut for ParameterConverter { fn visit_expr_field_mut(&mut self, field_expr: &mut syn::ExprField) { // Look for expressions like account_data.field or similar parameter patterns if let syn::Expr::Path(base_path) = field_expr.base.as_ref() { if let Some(ident) = base_path.path.get_ident() { let ident_str = ident.to_string(); // Check for various parameter naming patterns if ident_str.ends_with("_data") || ident_str == "account_data" || ident_str == "params" { // Replace with solana_account *field_expr.base = syn::parse_quote!(solana_account); self.converted = true; + } else if !["self", "ctx"].contains(&ident_str.as_str()) { + // Warn about potentially unhandled parameter patterns + self.errors.push(format!("Unrecognized parameter pattern: {}", ident_str)); } } } // Continue visiting nested expressions visit_mut::visit_expr_field_mut(self, field_expr); } } let mut seed_copy = seed; - let mut converter = ParameterConverter { converted: false }; + let mut converter = ParameterConverter { + converted: false, + errors: Vec::new() + }; visit_mut::visit_expr_mut(&mut converter, &mut seed_copy); Ok(seed_copy) }
507-509
: Add more descriptive error context for account validation. The validation logic is correct but the error could be more helpful for debugging mismatched account counts.
// Validate we have matching number of PDAs, compressed accounts, and bumps
- if solana_accounts.len() != compressed_accounts.len() || solana_accounts.len() != bumps.len() {
-     return err!(ErrorCode::InvalidAccountCount);
- }
+ if solana_accounts.len() != compressed_accounts.len() || solana_accounts.len() != bumps.len() {
+     msg!("Account count mismatch: {} PDAs, {} compressed accounts, {} bumps",
+         solana_accounts.len(), compressed_accounts.len(), bumps.len());
+     return err!(ErrorCode::InvalidAccountCount);
+ }
601-697
: Seeds extraction logic could be more robust. The find_account_seeds_for_type function may fail silently if no matching account struct is found, but the error message suggests this is expected behavior. Consider adding debug logging for troubleshooting.
// Find seeds for this account type from existing account structs
let seeds_info = find_account_seeds_for_type(&content.1, &struct_name)?
    .ok_or_else(|| syn::Error::new_spanned(
        &struct_name,
        format!(
            "No account struct found with 'init' constraint and seeds for type '{}'. \
             Please ensure you have an account struct (with #[derive(Accounts)]) that initializes \
             this account type with seeds specified in the #[account(init, seeds = [...], ...)] attribute.",
            struct_name
        )
    ))?;
The error message is comprehensive and helpful for developers. This looks good as implemented.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
sdk-libs/macros/src/compressible.rs (1 hunks)
sdk-libs/macros/src/lib.rs (2 hunks)
sdk-tests/anchor-compressible-derived/src/lib.rs (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- sdk-tests/anchor-compressible-derived/src/lib.rs
🧰 Additional context used
🧬 Code Graph Analysis (1)
sdk-libs/macros/src/lib.rs (4)
sdk-libs/macros/src/compressible.rs (3)
syn
(270-270)derive_has_compression_info
(705-760)add_compressible_instructions
(266-702)sdk-libs/macros/src/hasher/light_hasher.rs (6)
syn
(148-148)syn
(197-197)syn
(276-276)syn
(333-333)syn
(382-382)derive_light_hasher
(51-88)sdk-libs/sdk-types/src/cpi_accounts.rs (1)
cpi_signer
(39-41)sdk-libs/macros/src/program.rs (1)
program
(290-301)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (22)
- GitHub Check: cli-v2
- GitHub Check: examples-tests (sdk-anchor-test-program, ["cargo-test-sbf -p sdk-anchor-test", "cargo-test-sbf -p...
- GitHub Check: sdk-tests
- GitHub Check: lint
- GitHub Check: stateless-js-v1
- GitHub Check: Test sdk-libs
- GitHub Check: Test program-libs-slow
- GitHub Check: Test concurrent-merkle-tree
- GitHub Check: Test program-libs-fast
- GitHub Check: Test batched-merkle-tree-simulate
- GitHub Check: system-programs (light-system-program, ["cargo-test-sbf -p system-test"])
- GitHub Check: system-programs (system-cpi-test-v2-other, ["cargo-test-sbf -p system-cpi-v2-test -- --skip funct...
- GitHub Check: system-programs (account-compression, ["cargo-test-sbf -p account-compression-test"])
- GitHub Check: system-programs (system-cpi-test-v2-event, ["cargo-test-sbf -p system-cpi-v2-test -- event::parse"])
- GitHub Check: cli-v1
- GitHub Check: system-programs (system-cpi-test-v2-functional, ["cargo-test-sbf -p system-cpi-v2-test -- functio...
- GitHub Check: system-programs (random-e2e-test, ["cargo-test-sbf -p e2e-test"])
- GitHub Check: system-programs (light-registry, ["cargo-test-sbf -p registry-test"])
- GitHub Check: system-programs (system-cpi-test, ["cargo-test-sbf -p system-cpi-test"])
- GitHub Check: system-programs (light-compressed-token, ["cargo-test-sbf -p compressed-token-test"])
- GitHub Check: Forester e2e test
- GitHub Check: stateless-js-v2
🔇 Additional comments (8)
sdk-libs/macros/src/compressible.rs (2)
1-31
: LGTM! The imports and type definitions are well-structured for a procedural macro implementation. The IdentList and SeedInfo types appropriately encapsulate the data needed for parsing and code generation.
704-761
: LGTM! Well-implemented derive macro. The derive_has_compression_info function properly validates the required field and generates appropriate trait implementations. The error messages are clear and helpful for developers.
sdk-libs/macros/src/lib.rs (6)
157-194
: Excellent documentation improvements for LightHasher. The updated documentation provides clear examples and explains the #[hash] attribute usage for fields exceeding field size limits. This significantly improves developer experience.
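Sketch of the documented usage (struct and field names invented; the derive and #[hash] attribute come from the Light macros crate, and the exact import path is assumed):

// use light_sdk::LightHasher; // assumed re-export path

#[derive(LightHasher)]
pub struct UserRecord {
    pub owner: [u8; 32],
    #[hash] // field larger than the field-size limit is hashed to a field-sized value first
    pub name: String,
    pub score: u64,
}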
196-229
: Well-documented HasCompressionInfo derive macro. The documentation clearly explains the requirements and usage pattern, including the need for the #[skip] attribute on the compression_info field.
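Sketch of the documented requirement (names invented, import paths assumed): the compression_info field is an Option and is excluded from hashing via #[skip].

#[derive(LightHasher, HasCompressionInfo)]
pub struct GameSession {
    #[skip] // excluded from the hash, as required for compression_info
    pub compression_info: Option<CompressionInfo>,
    pub player: [u8; 32],
    pub score: u64,
}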
231-251
: Clear documentation for compressible instructions macro. The documentation provides a practical example and correctly notes the important ordering requirement relative to the #[program] attribute.
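Ordering sketch only, based on the usage shown elsewhere in this PR (module body omitted):

#[add_compressible_instructions(UserRecord, GameSession)]
#[program]
pub mod my_program {
    use super::*;
    // handwritten instructions live here alongside the generated
    // compress/decompress/config instructions
}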
253-276
: Good documentation for native compressible instructions. The documentation clearly distinguishes this from the Anchor version and explains the manual dispatch requirement.
287-301
: Improved naming and documentation for CPI signer macro. The rename from derive_light_cpi_signer_pda to derive_light_cpi_signer better reflects the macro's purpose, and the documentation is clear about compile-time derivation requirements.
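For reference, the usage quoted earlier in this PR's test program:

pub const LIGHT_CPI_SIGNER: CpiSigner =
    derive_light_cpi_signer!("GRLu2hKaAiMbxpkAM1HeXzks9YeGuz18SEgXEizVvPqX");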
303-325
: Light program macro converted to attribute macro correctly. The conversion from derive to attribute macro and delegation to program::program aligns with the overall architecture improvements.
Actionable comments posted: 2
♻️ Duplicate comments (4)
sdk-tests/native-compressible/src/create_dynamic_pda.rs (2)
27-32
: Add bounds checking for accounts array access. The function accesses accounts array indices 0-4 without checking if sufficient accounts are provided. This could cause a panic if fewer than 5 accounts are passed.
Add validation before accessing accounts:
+ if accounts.len() < 5 { + solana_program::msg!("Insufficient accounts provided: expected at least 5, got {}", accounts.len()); + return Err(LightSdkError::InvalidNumberOfAccounts); + } + let fee_payer = &accounts[0]; // UNCHECKED: ...caller program checks this. let solana_account = &accounts[1];
110-110
: Validate remaining accounts length before accessing. The CpiAccounts constructor accesses accounts[5..] without ensuring enough accounts exist.
Add validation:
+ if accounts.len() < 6 { + solana_program::msg!("Insufficient accounts for CPI: expected at least 6 accounts"); + return Err(LightSdkError::InvalidNumberOfAccounts); + } let cpi_accounts_struct = CpiAccounts::new(fee_payer, &accounts[5..], crate::LIGHT_CPI_SIGNER);sdk-libs/macros/src/compressible.rs (1)
332-332
: Clarify discriminator comment as suggested.The hardcoded zero discriminator comment could still be more descriptive as noted in past reviews.
sdk-tests/native-compressible/src/lib.rs (1)
53-54
: Past issue still present: Replace panic with proper error handling.The panic in the TryFrom implementation remains unaddressed from previous reviews.
🧹 Nitpick comments (6)
program-tests/package.json (1)
9-17
: Consider collapsing the duplicated test scriptsNine nearly identical
"test-…"
entries differ only by the package name. A single parameterised helper (e.g. using annpm
“workspace” loop or annx
target) avoids repetition and keepspackage.json
maintainable.Example (minimal):
"scripts": { "test:pkg": "cargo test-sbf -p", "test": "export RUSTFLAGS=\"-D warnings\" && \ for pkg in account-compression-test system-test registry-test compressed-token-test \ system-cpi-test system-cpi-v2-test e2e-test sdk-anchor-test sdk-pinocchio-test; \ do pnpm run test:pkg \"$pkg\"; done" }Not mandatory, but will pay off as the matrix grows.
sdk-libs/macros/src/discriminator.rs (1)
58-59
: Remove misleading test assertion. The assertion checking for "account:MyAccount" in the output won't work as the comment acknowledges. This creates confusion about what the test actually validates.
Consider removing this assertion or replacing it with a test that validates the actual discriminator values differ when the feature is enabled:
- // The discriminator value will be different based on whether anchor-discriminator-compat is enabled - #[cfg(feature = "anchor-discriminator-compat")] - assert!(output.contains("account:MyAccount")); // This won't be visible in output, but logic uses it + // When anchor-discriminator-compat is enabled, the discriminator value will be different + // due to the "account:" prefix being added before hashingsdk-libs/macros/src/compressible.rs (2)
196-230
: Simplify parameter conversion logic.The visitor pattern implementation for converting seed parameters is overly complex for this use case. Consider a simpler approach.
fn convert_single_seed_parameter(seed: Expr, _target_type: &Ident) -> Result<Expr> { - // Use visitor pattern to find and replace parameter references - struct ParameterConverter { - converted: bool, - } - - impl visit_mut::VisitMut for ParameterConverter { - fn visit_expr_field_mut(&mut self, field_expr: &mut syn::ExprField) { - // Look for expressions like account_data.field or similar parameter patterns - if let syn::Expr::Path(base_path) = field_expr.base.as_ref() { - if let Some(ident) = base_path.path.get_ident() { - let ident_str = ident.to_string(); - // Check for various parameter naming patterns - if ident_str.ends_with("_data") - || ident_str == "account_data" - || ident_str == "params" - { - // Replace with solana_account - *field_expr.base = syn::parse_quote!(solana_account); - self.converted = true; - } - } - } - - // Continue visiting nested expressions - visit_mut::visit_expr_field_mut(self, field_expr); - } - } - - let mut seed_copy = seed; - let mut converter = ParameterConverter { converted: false }; - visit_mut::visit_expr_mut(&mut converter, &mut seed_copy); - - Ok(seed_copy) + // Simple string replacement approach for common patterns + let seed_str = quote!(#seed).to_string(); + if seed_str.contains("account_data") || seed_str.contains("_data") || seed_str.contains("params") { + Ok(syn::parse_quote!(solana_account)) + } else { + Ok(seed) + } }
574-582
: Consider using a more specific error type.The generic
ErrorCode
enum could benefit from more descriptive error variants for better debugging.#[error_code] pub enum ErrorCode { - #[msg("Invalid account count: PDAs and compressed accounts must match")] - InvalidAccountCount, - #[msg("Rent recipient does not match config")] - InvalidRentRecipient, + #[msg("Account count mismatch: expected {} PDAs, {} compressed accounts, and {} bumps to match")] + AccountCountMismatch, + #[msg("Rent recipient mismatch: expected {}, got {}")] + RentRecipientMismatch, }sdk-tests/anchor-compressible/src/lib.rs (2)
168-194
: Consider reducing code duplication in decompression logic.The decompression logic for UserRecord and GameSession is nearly identical and could be refactored.
+ fn process_compressed_account<T: HasCompressionInfo + LightDiscriminator>( + data: T, + compressed_data: &CompressedAccountData, + bump: u8, + solana_account: &AccountInfo, + cpi_accounts: &CpiAccounts, + rent_payer: &Signer, + address_space: Pubkey, + program_id: &Pubkey, + ) -> Result<Vec<CompressedAccountInfo>> { + let bump_slice = [bump]; + let mut seeds_refs = Vec::with_capacity(compressed_data.seeds.len() + 1); + for seed in &compressed_data.seeds { + seeds_refs.push(seed.as_slice()); + } + seeds_refs.push(&bump_slice); + + let light_account = LightAccount::<'_, T>::new_mut( + program_id, + &compressed_data.meta, + data, + )?; + + prepare_accounts_for_decompress_idempotent::<T>( + &[solana_account], + vec![light_account], + &[seeds_refs.as_slice()], + cpi_accounts, + rent_payer, + address_space, + ) + } match compressed_data.data { CompressedAccountVariant::UserRecord(data) => { - let mut seeds_refs = Vec::with_capacity(compressed_data.seeds.len() + 1); - for seed in &compressed_data.seeds { - seeds_refs.push(seed.as_slice()); - } - seeds_refs.push(&bump_slice); - - // Create LightAccount with correct UserRecord discriminator - let light_account = LightAccount::<'_, UserRecord>::new_mut( - &crate::ID, - &compressed_data.meta, - data, - )?; - - // Process this single UserRecord account - let compressed_infos = prepare_accounts_for_decompress_idempotent::<UserRecord>( - &[&solana_accounts[i]], - vec![light_account], - &[seeds_refs.as_slice()], - &cpi_accounts, - &ctx.accounts.rent_payer, - address_space, - )?; - - all_compressed_infos.extend(compressed_infos); + let compressed_infos = process_compressed_account( + data, + &compressed_data, + bump, + &solana_accounts[i], + &cpi_accounts, + &ctx.accounts.rent_payer, + address_space, + &crate::ID, + )?; + all_compressed_infos.extend(compressed_infos); } CompressedAccountVariant::GameSession(data) => { - // Build seeds refs without cloning - pre-allocate capacity - let mut seeds_refs = Vec::with_capacity(compressed_data.seeds.len() + 1); - for seed in &compressed_data.seeds { - seeds_refs.push(seed.as_slice()); - } - seeds_refs.push(&bump_slice); - - // Create LightAccount with correct GameSession discriminator - let light_account = LightAccount::<'_, GameSession>::new_mut( - &crate::ID, - &compressed_data.meta, - data, - )?; - - // Process this single GameSession account - let compressed_infos = prepare_accounts_for_decompress_idempotent::<GameSession>( - &[&solana_accounts[i]], - vec![light_account], - &[seeds_refs.as_slice()], - &cpi_accounts, - &ctx.accounts.rent_payer, - address_space, - )?; - all_compressed_infos.extend(compressed_infos); + let compressed_infos = process_compressed_account( + data, + &compressed_data, + bump, + &solana_accounts[i], + &cpi_accounts, + &ctx.accounts.rent_payer, + address_space, + &crate::ID, + )?; + all_compressed_infos.extend(compressed_infos); } }
603-603
: Clarify discriminator usage comment.Similar to the macro file, this discriminator comment could be more descriptive about when it's used.
impl LightDiscriminator for CompressedAccountVariant { - const LIGHT_DISCRIMINATOR: [u8; 8] = [0; 8]; // This won't be used directly + const LIGHT_DISCRIMINATOR: [u8; 8] = [0; 8]; // Placeholder - actual discriminators come from variant structs const LIGHT_DISCRIMINATOR_SLICE: &'static [u8] = &Self::LIGHT_DISCRIMINATOR; }
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
Cargo.lock
is excluded by!**/*.lock
📒 Files selected for processing (23)
program-tests/package.json
(1 hunks)program-tests/system-cpi-test/tests/test_program_owned_trees.rs
(1 hunks)sdk-libs/macros/Cargo.toml
(1 hunks)sdk-libs/macros/src/compressible.rs
(1 hunks)sdk-libs/macros/src/discriminator.rs
(2 hunks)sdk-libs/sdk/Cargo.toml
(2 hunks)sdk-libs/sdk/src/compressible/compress_account.rs
(1 hunks)sdk-libs/sdk/src/compressible/compress_account_on_init.rs
(1 hunks)sdk-libs/sdk/src/compressible/compression_info.rs
(1 hunks)sdk-libs/sdk/src/compressible/decompress_idempotent.rs
(1 hunks)sdk-libs/sdk/src/lib.rs
(2 hunks)sdk-tests/anchor-compressible-derived/Cargo.toml
(1 hunks)sdk-tests/anchor-compressible/Cargo.toml
(1 hunks)sdk-tests/anchor-compressible/src/lib.rs
(1 hunks)sdk-tests/anchor-compressible/tests/test_discriminator.rs
(1 hunks)sdk-tests/native-compressible/Cargo.toml
(1 hunks)sdk-tests/native-compressible/src/compress_dynamic_pda.rs
(1 hunks)sdk-tests/native-compressible/src/create_dynamic_pda.rs
(1 hunks)sdk-tests/native-compressible/src/create_pda.rs
(2 hunks)sdk-tests/native-compressible/src/decompress_dynamic_pda.rs
(1 hunks)sdk-tests/native-compressible/src/lib.rs
(1 hunks)sdk-tests/native-compressible/src/update_pda.rs
(2 hunks)sdk-tests/native-compressible/tests/test_compressible_flow.rs
(1 hunks)
✅ Files skipped from review due to trivial changes (3)
- program-tests/system-cpi-test/tests/test_program_owned_trees.rs
- sdk-tests/native-compressible/src/compress_dynamic_pda.rs
- sdk-tests/anchor-compressible/Cargo.toml
🚧 Files skipped from review as they are similar to previous changes (10)
- sdk-libs/macros/Cargo.toml
- sdk-libs/sdk/Cargo.toml
- sdk-tests/native-compressible/src/create_pda.rs
- sdk-libs/sdk/src/lib.rs
- sdk-libs/sdk/src/compressible/compress_account.rs
- sdk-tests/native-compressible/Cargo.toml
- sdk-tests/native-compressible/tests/test_compressible_flow.rs
- sdk-libs/sdk/src/compressible/compression_info.rs
- sdk-libs/sdk/src/compressible/compress_account_on_init.rs
- sdk-tests/native-compressible/src/decompress_dynamic_pda.rs
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (22)
- GitHub Check: cli-v1
- GitHub Check: stateless-js-v1
- GitHub Check: Forester e2e test
- GitHub Check: sdk-tests
- GitHub Check: system-programs (random-e2e-test, ["cargo-test-sbf -p e2e-test"])
- GitHub Check: system-programs (system-cpi-test-v2-functional, ["cargo-test-sbf -p system-cpi-v2-test -- functio...
- GitHub Check: system-programs (light-compressed-token, ["cargo-test-sbf -p compressed-token-test"])
- GitHub Check: system-programs (system-cpi-test, ["cargo-test-sbf -p system-cpi-test"])
- GitHub Check: system-programs (system-cpi-test-v2-other, ["cargo-test-sbf -p system-cpi-v2-test -- --skip funct...
- GitHub Check: system-programs (light-registry, ["cargo-test-sbf -p registry-test"])
- GitHub Check: stateless-js-v2
- GitHub Check: system-programs (system-cpi-test-v2-event, ["cargo-test-sbf -p system-cpi-v2-test -- event::parse"])
- GitHub Check: system-programs (light-system-program, ["cargo-test-sbf -p system-test"])
- GitHub Check: system-programs (account-compression, ["cargo-test-sbf -p account-compression-test"])
- GitHub Check: cli-v2
- GitHub Check: Test sdk-libs
- GitHub Check: Test program-libs-fast
- GitHub Check: Test program-libs-slow
- GitHub Check: examples-tests (sdk-anchor-test-program, ["cargo-test-sbf -p sdk-anchor-test", "cargo-test-sbf -p...
- GitHub Check: Test concurrent-merkle-tree
- GitHub Check: Test batched-merkle-tree-simulate
- GitHub Check: lint
🔇 Additional comments (20)
program-tests/package.json (1)
7-7
: 👍 Adding per-package test scripts improves CI granularityHaving explicit scripts for each test package makes it trivial to run a single suite locally or inside selective CI jobs. Good addition.
sdk-tests/native-compressible/src/update_pda.rs (1)
10-10
: LGTM!The refactoring from
MyCompressedAccount
toMyPdaAccount
is consistent with the broader changes in the codebase. The initialization withcompression_info: None
is appropriate for compressed accounts.Also applies to: 25-31
sdk-libs/macros/src/discriminator.rs (1)
12-21
: Well-implemented feature flag for Anchor compatibilityThe conditional discriminator generation correctly implements Anchor's scheme by prefixing "account:" when the feature is enabled. This provides seamless compatibility while maintaining backward compatibility.
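For reference, a minimal sketch of the Anchor-style derivation this flag mirrors, assuming the sha2 crate; MyAccount is an illustrative name and this is not the macro's actual code:
use sha2::{Digest, Sha256};

// Illustrative only: Anchor-compatible discriminator = first 8 bytes of
// SHA-256 over "account:<StructName>".
fn anchor_compat_discriminator(struct_name: &str) -> [u8; 8] {
    let digest = Sha256::digest(format!("account:{struct_name}").as_bytes());
    let mut disc = [0u8; 8];
    disc.copy_from_slice(&digest[..8]);
    disc
}

fn main() {
    println!("{:?}", anchor_compat_discriminator("MyAccount"));
}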
sdk-libs/sdk/src/compressible/decompress_idempotent.rs (1)
38-152
: Well-implemented idempotent decompression functionThe implementation correctly handles:
- Idempotency by checking if PDAs are already initialized
- Proper validation of compressed/decompressed address relationships
- Correct space calculation and rent-exempt balance
- Proper initialization of decompressed account data with discriminator and compression info
- Clearing of compressed account data after decompression
The function addresses all the concerns from previous reviews and follows best practices.
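As a rough illustration of the idempotency check listed in the preceding comment (an assumed shape, not the SDK's exact logic), an already-initialized PDA can be detected and skipped like this:
use solana_program::account_info::AccountInfo;

// Sketch: a PDA that already holds lamports or data is treated as
// "already decompressed" and skipped instead of failing the instruction.
fn needs_decompression(pda: &AccountInfo) -> bool {
    pda.lamports() == 0 && pda.data_is_empty()
}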
sdk-libs/macros/src/compressible.rs (3)
53-54
: LGTM: Good error handling with fallback.The function properly handles the case where no instruction parameters are found by providing a sensible default fallback.
296-301
: LGTM: Validation properly implemented.The empty struct validation has been correctly implemented as suggested in the past review, preventing potential panics.
63-64
: Comment Not Applicable: Noinstruction_data
Access in This File
The macro implementation insdk-libs/macros/src/compressible.rs
only handles derive-time metadata parsing and does not reference or index intoinstruction_data
. The slicing and discriminator logic (instruction_data[0]
) live in generated entry-point code and test harnesses, not here—so this review comment can be ignored.Likely an incorrect or invalid review comment.
sdk-tests/anchor-compressible-derived/Cargo.toml (4)
2-4
: LGTM: Package metadata properly updated.The package name, version, and description correctly reflect the new test program's purpose.
16-17
: LGTM: IDL build features properly configured.The default features correctly include IDL build capabilities for both anchor-lang and light-sdk.
35-37
: LGTM: Dev dependencies properly updated.The new dev dependencies for light-client, light-compressible-client, and light-test-utils with appropriate feature flags align with the PR's test infrastructure changes.
23-23
: Verify light-sdk feature compatibilityI wasn’t able to locate the
light-sdk
crate manifest in the workspace to confirm these features exist and interoperate. Please manually ensure:
- The
light-sdk
Cargo.toml defines a[features]
block with:
anchor
idl-build
anchor-discriminator-compat
- Enabling all three together does not introduce missing dependencies or conflicts.
Once confirmed (or if the crate lives under a different path), update or point to the correct manifest.
sdk-tests/native-compressible/src/lib.rs (4)
63-64
: Good error handling for discriminator parsing.The error handling properly addresses the potential issue of empty instruction data and converts parsing errors appropriately.
99-106
: LGTM: Proper struct definition with compression support.The MyPdaAccount struct correctly implements the required traits and includes the compression_info field with proper skip annotation.
109-129
: LGTM: Proper HasCompressionInfo implementation.The trait implementation correctly handles the expected on-chain state where compression_info is always Some, with appropriate error messages.
131-136
: Size calculation is correct.
Self::LIGHT_DISCRIMINATOR_SLICE.len()
is 8 bytes.data: [u8; 31]
serializes to 31 bytes.Option<CompressionInfo>
serializes to 1 byte (tag) + 9 bytes (CompressionInfo::INIT_SPACE
= 8 bytes foru64
+ 1 byte for enum) = 10 bytes.Total: 8 + 31 + 1 + 9 = 49 bytes, matching the
8 + 31 + 1 + 9
calculation.sdk-tests/anchor-compressible/src/lib.rs (5)
27-68
: LGTM: Well-structured create_record instruction.The instruction properly validates config, sets account data, verifies rent recipient, and handles compression with appropriate error handling.
142-147
: Good validation for account count matching.The validation ensures PDAs, compressed accounts, and bumps arrays all have matching lengths before processing.
292-380
: Comprehensive multi-account compression implementation.The create_user_record_and_game_session instruction properly handles creating and compressing multiple accounts in a single transaction with appropriate CPI handling.
654-664
: LGTM: Proper account struct definition.The UserRecord struct correctly implements all required traits and includes proper field annotations for compression support.
428-428
: Space calculation forUserRecord
is correct.The breakdown matches Borsh serialization:
- 8 bytes account discriminator
- 1 byte
Option
tag + 8 byteslast_written_slot
+ 1 byteCompressionState
= 10 bytes- 32 bytes
Pubkey
- 4 bytes string length prefix + 32 bytes max-length
String
= 36 bytes- 8 bytes
u64 score
Total: 8 + 10 + 32 + 36 + 8 = 94 bytes (matches
8 + 32 + 4 + 32 + 8 + 10
). No changes required.
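The same arithmetic written out as constants (a sketch; the constant names are illustrative, not taken from the PR):
// Illustrative space breakdown for the UserRecord layout discussed above.
const DISCRIMINATOR: usize = 8;            // Anchor account discriminator
const COMPRESSION_INFO: usize = 1 + 8 + 1; // Option tag + last_written_slot (u64) + state enum
const OWNER: usize = 32;                   // Pubkey
const NAME: usize = 4 + 32;                // String length prefix + #[max_len(32)]
const SCORE: usize = 8;                    // u64

const USER_RECORD_SPACE: usize = DISCRIMINATOR + COMPRESSION_INFO + OWNER + NAME + SCORE;

fn main() {
    assert_eq!(USER_RECORD_SPACE, 94);
}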
@@ -4,7 +4,17 @@ | |||
"license": "Apache-2.0", | |||
"description": "Test programs for Light Protocol uses test-sbf to build because build-sbf -- -p creates an infinite loop.", | |||
"scripts": { | |||
"build": "cargo test-sbf -p create-address-test-program" | |||
"build": "cargo test-sbf -p create-address-test-program", | |||
"test": "RUSTFLAGS=\"-D warnings\" && pnpm test-account-compression && pnpm test-system && pnpm test-registry && pnpm test-compressed-token && pnpm test-system-cpi && pnpm test-system-cpi-v2 && pnpm test-e2e && pnpm test-sdk-anchor && pnpm test-sdk-pinocchio", |
RUSTFLAGS
is not propagated to the test commands
In "test"
you currently do:
"test": "RUSTFLAGS=\"-D warnings\" && pnpm test-account-compression && …"
The first part (RUSTFLAGS="-D warnings"
) is just a shell variable assignment statement; it does not export the variable, so none of the subsequent pnpm …
invocations see it. As a result, the compiler warnings you intend to escalate to errors are silently ignored.
Proposed fix (POSIX-shell):
-"test": "RUSTFLAGS=\"-D warnings\" && pnpm test-account-compression && pnpm test-system && pnpm test-registry && pnpm test-compressed-token && pnpm test-system-cpi && pnpm test-system-cpi-v2 && pnpm test-e2e && pnpm test-sdk-anchor && pnpm test-sdk-pinocchio",
+"test": "export RUSTFLAGS=\"-D warnings\" && pnpm run test-account-compression && pnpm run test-system && pnpm run test-registry && pnpm run test-compressed-token && pnpm run test-system-cpi && pnpm run test-system-cpi-v2 && pnpm run test-e2e && pnpm run test-sdk-anchor && pnpm run test-sdk-pinocchio",
Cross-platform alternative (requires cross-env
):
"test": "cross-env RUSTFLAGS=\"-D warnings\" pnpm run test-account-compression && …"
Either ensures every downstream cargo test-sbf
sees -D warnings
.
🤖 Prompt for AI Agents
In program-tests/package.json at line 8, the RUSTFLAGS environment variable is
set but not exported, so it is not passed to the subsequent pnpm test commands.
To fix this, modify the test script to either export RUSTFLAGS before running
the tests or use a tool like cross-env to set RUSTFLAGS for all the pnpm test
commands in a cross-platform way, ensuring that the Rust compiler warnings are
treated as errors during all test runs.
Actionable comments posted: 2
♻️ Duplicate comments (1)
sdk-libs/macros/src/compressible.rs (1)
548-553
: Consider more descriptive discriminator documentation.While the past review comment suggested improving the comment, the current explanation could still be clearer about the actual usage pattern.
The hardcoded zero discriminator needs better documentation. Consider this improvement:
impl light_sdk::LightDiscriminator for CompressedAccountVariant { - const LIGHT_DISCRIMINATOR: [u8; 8] = [0; 8]; // This won't be used directly + // Enum wrapper doesn't have its own discriminator - each variant delegates to its struct's discriminator + const LIGHT_DISCRIMINATOR: [u8; 8] = [0; 8]; // Placeholder value, not used for serialization const LIGHT_DISCRIMINATOR_SLICE: &'static [u8] = &Self::LIGHT_DISCRIMINATOR; }
🧹 Nitpick comments (4)
sdk-libs/macros/src/compressible.rs (2)
101-105
: Improve glob import handling.The current glob import handling adds a special "*" marker but this approach has limitations:
- The marker isn't effectively used in type resolution
- Glob imports can't be properly resolved without knowing all available symbols
Consider either implementing proper glob resolution or documenting the limitation:
UseTree::Glob(_) => { - // For glob imports, we can't easily resolve specific items - // In this case, we'll add a special marker for the base path - import_info.add_import("*".to_string(), base_path); + // TODO: Glob imports are not fully supported - type resolution may fail + // for symbols imported via glob. Consider explicit imports for better macro support. + import_info.add_import("*".to_string(), base_path); }
381-390
: Improve parameter name detection patterns.The current parameter conversion logic only checks for specific naming patterns (
_data
,account_data
,params
), which might miss valid parameter names or incorrectly convert unrelated identifiers.Consider making the parameter detection more explicit or configurable:
// Check for various parameter naming patterns - if ident_str.ends_with("_data") - || ident_str == "account_data" - || ident_str == "params" + // TODO: Make parameter patterns configurable or use more robust detection + if is_likely_instruction_parameter(&ident_str) { // Replace with solana_account *field_expr.base = syn::parse_quote!(solana_account); self.converted = true; }Where
is_likely_instruction_parameter
could implement more sophisticated logic.sdk-tests/anchor-compressible-derived/src/state.rs (2)
5-5
: Consider derive macro ordering consistency.The derive macros on
UserRecord
are in a different order than onGameSession
. Consider standardizing the order for consistency across the codebase.Standardize derive macro ordering:
-#[derive(Debug, LightHasher, LightDiscriminator, HasCompressionInfo, Default, InitSpace)] +#[derive(Debug, LightHasher, LightDiscriminator, Default, InitSpace, HasCompressionInfo)]This matches the order used in
GameSession
.
24-32
: Consider session lifecycle field validation.The
GameSession
struct includes bothstart_time
andend_time
fields, withend_time
being optional. Consider adding validation logic to ensureend_time
is always greater thanstart_time
when present.Consider adding a validation method:
impl GameSession { pub fn validate_times(&self) -> Result<(), &'static str> { if let Some(end_time) = self.end_time { if end_time <= self.start_time { return Err("End time must be after start time"); } } Ok(()) } }
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)
sdk-libs/macros/src/compressible.rs
(1 hunks)sdk-tests/anchor-compressible-derived/Cargo.toml
(1 hunks)sdk-tests/anchor-compressible-derived/src/constraints.rs
(1 hunks)sdk-tests/anchor-compressible-derived/src/lib.rs
(1 hunks)sdk-tests/anchor-compressible-derived/src/state.rs
(1 hunks)
✅ Files skipped from review due to trivial changes (1)
- sdk-tests/anchor-compressible-derived/Cargo.toml
🚧 Files skipped from review as they are similar to previous changes (1)
- sdk-tests/anchor-compressible-derived/src/lib.rs
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (22)
- GitHub Check: system-programs (random-e2e-test, ["cargo-test-sbf -p e2e-test"])
- GitHub Check: system-programs (system-cpi-test-v2-other, ["cargo-test-sbf -p system-cpi-v2-test -- --skip funct...
- GitHub Check: system-programs (system-cpi-test-v2-event, ["cargo-test-sbf -p system-cpi-v2-test -- event::parse"])
- GitHub Check: system-programs (system-cpi-test-v2-functional, ["cargo-test-sbf -p system-cpi-v2-test -- functio...
- GitHub Check: system-programs (light-registry, ["cargo-test-sbf -p registry-test"])
- GitHub Check: system-programs (light-compressed-token, ["cargo-test-sbf -p compressed-token-test"])
- GitHub Check: system-programs (account-compression, ["cargo-test-sbf -p account-compression-test"])
- GitHub Check: system-programs (system-cpi-test, ["cargo-test-sbf -p system-cpi-test"])
- GitHub Check: system-programs (light-system-program, ["cargo-test-sbf -p system-test"])
- GitHub Check: Forester e2e test
- GitHub Check: Test concurrent-merkle-tree
- GitHub Check: Test sdk-libs
- GitHub Check: Test program-libs-fast
- GitHub Check: Test program-libs-slow
- GitHub Check: Test batched-merkle-tree-simulate
- GitHub Check: lint
- GitHub Check: sdk-tests
- GitHub Check: cli-v1
- GitHub Check: stateless-js-v1
- GitHub Check: examples-tests (sdk-anchor-test-program, ["cargo-test-sbf -p sdk-anchor-test", "cargo-test-sbf -p...
- GitHub Check: stateless-js-v2
- GitHub Check: cli-v2
🔇 Additional comments (9)
sdk-libs/macros/src/compressible.rs (3)
514-521
: LGTM: Empty struct validation implemented.The validation for empty struct list has been properly implemented as requested in previous review comments. The error handling is clear and prevents panics.
724-727
: Robust error handling for account validation.The validation logic correctly checks that the number of PDA accounts, compressed accounts, and bump values all match, preventing potential array bounds errors.
954-976
: Well-structured trait derivation.The
HasCompressionInfo
trait derivation is well-implemented with proper field validation and clear error messages. The generated implementation correctly handles theOption<CompressionInfo>
field with appropriate unwrapping and mutation methods.sdk-tests/anchor-compressible-derived/src/constraints.rs (3)
15-16
: LGTM: Proper PDA seed configuration.The PDA seeds configuration using "user_record" and the user's public key is appropriate for creating unique user-specific accounts. The bump seed ensures deterministic address derivation.
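For context, a client-side sketch of deriving the same PDA (program id and user key are placeholders, not values from this PR):
use solana_program::pubkey::Pubkey;

// Sketch: mirrors seeds = [b"user_record", user.key().as_ref()] with bump.
fn derive_user_record_pda(program_id: &Pubkey, user: &Pubkey) -> (Pubkey, u8) {
    Pubkey::find_program_address(&[b"user_record".as_ref(), user.as_ref()], program_id)
}

fn main() {
    let (pda, bump) = derive_user_record_pda(&Pubkey::new_unique(), &Pubkey::new_unique());
    println!("{pda} (bump {bump})");
}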
19-24
: Good security practice with account validation comments.The comments clearly indicate that
rent_recipient
andconfig
accounts are validated externally (via config and load_checked respectively), which is a good security practice for documenting validation responsibilities.
12-14
: Space calculation confirmed.The
Option<CompressionInfo>
serialized by Anchor uses 1 byte for the presence flag plus 9 bytes of payload (8 bytes forlast_written_slot: u64
and 1 byte for theCompressionState
discriminant), totaling exactly 10 bytes. No changes required.sdk-tests/anchor-compressible-derived/src/state.rs (3)
8-9
: Proper compression info field configuration.The
compression_info
field is correctly marked with#[skip]
to exclude it from hashing, which is appropriate since compression metadata shouldn't affect the account's hash.
10-15
: Well-designed hashable fields for UserRecord.The field design is appropriate with
owner
andname
marked for hashing whilescore
is excluded. The#[max_len(32)]
constraint on the name field provides bounds checking for the string.
21-22
: Consistent compression info field usage.The
GameSession
struct correctly follows the same pattern asUserRecord
with thecompression_info
field marked as#[skip]
.
sdk-libs/macros/src/compressible.rs
Outdated
let light_account = light_sdk::account::LightAccount::<'_, #struct_names>::new_mut( | ||
&crate::ID, | ||
&compressed_data.meta, | ||
data, | ||
)?; |
Verify LightAccount creation with correct type parameter.
The code creates LightAccount<'_, #struct_names>
but #struct_names
is a vector, not a single type. This could cause compilation errors.
The type parameter should match the specific struct type being processed:
// Create LightAccount with correct discriminator
- let light_account = light_sdk::account::LightAccount::<'_, #struct_names>::new_mut(
+ let light_account = light_sdk::account::LightAccount::<'_, _>::new_mut(
&crate::ID,
&compressed_data.meta,
data,
)?;
Or use the specific struct name from the match arm context.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
// Create LightAccount with correct discriminator
let light_account = light_sdk::account::LightAccount::<'_, _>::new_mut(
    &crate::ID,
    &compressed_data.meta,
    data,
)?;
🤖 Prompt for AI Agents
In sdk-libs/macros/src/compressible.rs around lines 758 to 762, the creation of
LightAccount uses a type parameter #struct_names which is a vector, not a single
type, causing compilation errors. Fix this by replacing #struct_names with the
specific struct type relevant to the current match arm or context, ensuring the
type parameter is a single concrete type rather than a vector.
let compressed_infos = light_sdk::compressible::prepare_accounts_for_decompress_idempotent::<#struct_names>( | ||
&[&solana_accounts[i]], | ||
vec![light_account], | ||
&[seeds_refs.as_slice()], | ||
&cpi_accounts, | ||
&ctx.accounts.rent_payer, | ||
address_space, | ||
)?; |
Potential type mismatch in prepare_accounts_for_decompress_idempotent.
Similar to the previous issue, the function call uses #struct_names
as a type parameter but should use the specific struct type from the current match arm.
Fix the type parameter to match the current variant:
// Process this single account
- let compressed_infos = light_sdk::compressible::prepare_accounts_for_decompress_idempotent::<#struct_names>(
+ let compressed_infos = light_sdk::compressible::prepare_accounts_for_decompress_idempotent::<_>(
&[&solana_accounts[i]],
vec![light_account],
&[seeds_refs.as_slice()],
&cpi_accounts,
&ctx.accounts.rent_payer,
address_space,
)?;
The compiler should be able to infer the correct type from the context.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
let compressed_infos = light_sdk::compressible::prepare_accounts_for_decompress_idempotent::<_>(
    &[&solana_accounts[i]],
    vec![light_account],
    &[seeds_refs.as_slice()],
    &cpi_accounts,
    &ctx.accounts.rent_payer,
    address_space,
)?;
🤖 Prompt for AI Agents
In sdk-libs/macros/src/compressible.rs around lines 765 to 772, the call to
prepare_accounts_for_decompress_idempotent incorrectly uses the generic type
parameter #struct_names instead of the specific struct type for the current
match arm. Replace #struct_names with the exact struct type corresponding to the
current variant being processed so the compiler can correctly infer the type
from context.
Actionable comments posted: 1
♻️ Duplicate comments (3)
sdk-libs/macros/src/compressible.rs (3)
784-787
: Discriminator comment could be clearer.The comment about the discriminator "won't be used directly" could be more descriptive to help future maintainers understand the purpose.
993-997
: Type parameter issue still exists.The
LightAccount
creation still uses#struct_names
as a type parameter, but#struct_names
is a vector of types, not a single type. This will cause compilation errors.
1000-1007
: Type parameter issue in prepare_accounts_for_decompress_idempotent.Similar to the previous issue, the function call incorrectly uses
#struct_names
as a type parameter instead of the specific struct type for the current match arm.
🧹 Nitpick comments (1)
sdk-libs/macros/src/compressible.rs (1)
298-313
: Consider making seed inference more configurable.The hardcoded seed inference pattern for
PoolState
is quite specific to AMM/DEX programs. Consider making this more generic or configurable to support different program types.You could add a configuration parameter or trait to define custom seed patterns:
+pub trait SeedInferencePattern { + fn infer_seeds(account_type: &str, import_info: &ImportInfo) -> Option<SeedInfo>; +} + fn try_infer_seeds_from_imports( account_type: &Ident, import_info: &ImportInfo, ) -> Result<Option<SeedInfo>> {
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
sdk-libs/macros/src/compressible.rs
(1 hunks)
🧰 Additional context used
🪛 GitHub Actions: lint
sdk-libs/macros/src/compressible.rs
[error] 1134-1134: Trailing whitespace detected at line 1134.
[error] 1142-1142: Trailing whitespace detected at line 1142.
[error] 1146-1146: Trailing whitespace detected at line 1146.
[error] 1153-1153: Trailing whitespace detected at line 1153.
[error] 1156-1156: Trailing whitespace detected at line 1156.
[error] 1160-1160: Trailing whitespace detected at line 1160.
[error] 1165-1165: Trailing whitespace detected at line 1165.
[error] rustfmt formatting failed due to trailing whitespace errors. Please remove trailing whitespace to fix formatting.
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Test batched-merkle-tree-simulate
- GitHub Check: Forester e2e test
🔇 Additional comments (5)
sdk-libs/macros/src/compressible.rs (5)
369-373
: Good defensive programming with recursion depth limit.The depth tracking to prevent infinite recursion is a solid approach for handling complex nested generic types.
397-403
: Comprehensive container type support.The extensive list of container types (smart pointers, collections, interior mutability) shows thorough consideration of Rust's type system patterns.
749-754
: Validation for empty struct list is properly implemented.The validation check for empty struct names has been correctly addressed from the previous review comment.
1188-1244
: Well-structured derive implementation.The
derive_has_compression_info
function properly validates preconditions (named fields, presence of compression_info field) and generates a clean trait implementation. The error messages are descriptive and helpful.
1060-1100
: Excellent error messaging for missing seed information.The detailed error messages with specific solutions for different scenarios (imported structs, inline modules, external modules) provide excellent developer experience. The multi-solution approach with examples is very helpful.
Actionable comments posted: 1
♻️ Duplicate comments (13)
sdk-libs/sdk/src/compressible/decompress_idempotent.rs (1)
40-40
: Use slice reference instead of Vec to avoid heap allocation.We should avoid vectors for better efficiency.
- compressed_accounts: Vec<LightAccount<'_, T>>, + compressed_accounts: &[LightAccount<'_, T>],sdk-libs/macros/src/native_compressible.rs (2)
153-153
: Zero discriminator could cause conflicts with other account types.The enum uses
[0; 8]
as its discriminator which may conflict with account types that also use a zero discriminator.Consider using a unique discriminator:
- const LIGHT_DISCRIMINATOR: [u8; 8] = [0; 8]; // Default discriminator for enum + // Use a unique discriminator to avoid conflicts + const LIGHT_DISCRIMINATOR: [u8; 8] = [0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x01];Alternatively, evaluate if the enum needs a discriminator since it's not directly serialized as an account.
391-391
: Document required constants for macro users.The generated code references
crate::LIGHT_CPI_SIGNER
andcrate::ID
which must be defined by the crate using this macro.Add documentation to the macro:
/// Generates native compressible instructions for the specified account types. /// /// # Required Constants /// /// The following constants must be defined in your crate: /// - `LIGHT_CPI_SIGNER`: The CPI signer for Light Protocol operations /// - `ID`: The program ID /// /// # Example /// ``` /// pub const LIGHT_CPI_SIGNER: Pubkey = /* ... */; /// pub const ID: Pubkey = /* ... */; /// /// #[native_compressible(MyAccount1, MyAccount2)] /// mod instructions { /// // Generated code will be inserted here /// } /// ```Also applies to: 503-503
sdk-libs/macros/src/compressible.rs (2)
850-855
: Clarify the purpose of hardcoded zero discriminator.The comment "This won't be used directly" is confusing and could lead to misunderstandings.
Consider a more descriptive comment:
impl light_sdk::LightDiscriminator for CompressedAccountVariant { - const LIGHT_DISCRIMINATOR: [u8; 8] = [0; 8]; // This won't be used directly + // The enum itself doesn't have a discriminator - each variant uses its struct's discriminator + const LIGHT_DISCRIMINATOR: [u8; 8] = [0; 8]; // Placeholder - actual discriminators come from variant structs const LIGHT_DISCRIMINATOR_SLICE: &'static [u8] = &Self::LIGHT_DISCRIMINATOR; }
1134-1134
: Remove trailing whitespace to fix linting errors.Multiple lines contain trailing whitespace causing the pipeline to fail.
- "No account struct found with 'init' constraint and seeds for type '{}'.\n\n", + "No account struct found with 'init' constraint and seeds for type '{}'.\n\n", struct_name ); - + // Check if we have imported account structs - provide different guidance if !import_info.get_potential_account_structs().is_empty() { error_msg.push_str("DETECTED IMPORTED ACCOUNT STRUCTS:\n"); for account_struct in import_info.get_potential_account_structs() { error_msg.push_str(&format!(" - {}\n", account_struct)); } error_msg.push_str("\n"); - + error_msg.push_str("EXTERNAL FILE MODULE SOLUTIONS:\n"); error_msg.push_str("1. ADD EXPLICIT SEEDS STRUCT: Create a minimal seed definition in the same module:\n"); error_msg.push_str(&format!(" #[derive(Accounts)]\n pub struct {}Seeds<'info> {{\n #[account(\n init,\n seeds = [\n POOL_SEED.as_bytes(),\n amm_config.key().as_ref(),\n token_0_mint.key().as_ref(),\n token_1_mint.key().as_ref(),\n ],\n bump\n )]\n pub {}: Box<Account<'info, {}>>,\n pub amm_config: AccountInfo<'info>,\n pub token_0_mint: AccountInfo<'info>,\n pub token_1_mint: AccountInfo<'info>,\n }}\n\n", struct_name, struct_name.to_string().to_snake_case(), struct_name)); - + error_msg.push_str("2. CONVERT TO INLINE MODULE: Move your account struct to an inline module:\n"); error_msg.push_str(&format!(" pub mod instructions {{\n use super::*;\n #[derive(Accounts)]\n pub struct Initialize<'info> {{\n #[account(\n init,\n seeds = [...],\n bump\n )]\n pub {}: Box<Account<'info, {}>>,\n // ... other fields\n }}\n }}\n pub use instructions::*;\n\n", struct_name.to_string().to_snake_case(), struct_name)); } else { error_msg.push_str("COMMON SOLUTIONS:\n"); error_msg.push_str("1. INLINE MODULE DEFINITION: Define your account struct in an inline module within the same file:\n"); error_msg.push_str(&format!(" pub mod initialize {{\n use super::*;\n #[derive(Accounts)]\n pub struct Initialize<'info> {{\n #[account(\n init,\n seeds = [...],\n bump\n )]\n pub {}: Box<Account<'info, {}>>,\n // ... other fields\n }}\n }}\n pub use initialize::*;\n\n", struct_name.to_string().to_snake_case(), struct_name)); - + error_msg.push_str("2. MOVE TO SAME MODULE: Move your account struct to the same module where #[add_compressible_instructions] is applied:\n"); error_msg.push_str(&format!(" #[derive(Accounts)]\n pub struct Initialize<'info> {{\n #[account(\n init,\n seeds = [...],\n bump\n )]\n pub {}: Box<Account<'info, {}>>,\n // ... other fields\n }}\n\n", struct_name.to_string().to_snake_case(), struct_name)); - + error_msg.push_str("3. CREATE A MINIMAL SEED STRUCT: If you can't move the existing struct, create a minimal one:\n"); error_msg.push_str(&format!(" #[derive(Accounts)]\n pub struct {}Seeds<'info> {{\n #[account(\n init,\n seeds = [/* your seeds here */],\n bump\n )]\n pub {}: Box<Account<'info, {}>>,\n }}\n\n", struct_name, struct_name.to_string().to_snake_case(), struct_name)); } - + error_msg.push_str("TECHNICAL INFO:\n"); error_msg.push_str("✓ Wrapper types supported: Account, Box<Account>, Option<Account>, Arc<Account>\n"); error_msg.push_str("✓ Required attributes: #[account(init, seeds = [...], bump)] and #[derive(Accounts)]\n"); error_msg.push_str("✓ External file modules require explicit seed definitions due to proc macro limitations\n"); - +Also applies to: 1142-1142, 1146-1146, 1153-1153, 1156-1156, 1160-1160, 1165-1165
sdk-libs/sdk/src/account.rs (1)
196-196
: Address security review TODO for DEFAULT_DATA_HASH usage.The TODO comments indicate pending security review for
DEFAULT_DATA_HASH = [1; 32]
.Before removing these TODOs:
- Verify that using a fixed all-ones array doesn't introduce collision or replay risks in your Merkle/compression scheme
- Document the threat model and why this sentinel value is safe
- Consider if a different constant or approach would be more appropriate
Also update the comment in
sdk-libs/sdk-types/src/constants.rs
whereDEFAULT_DATA_HASH
is defined.Also applies to: 329-330
sdk-libs/macros/src/lib.rs (1)
5-5
: Import ofItemMod
is removed but still used in the fileThe import of
syn::ItemMod
has been removed from line 5, but it's still being used in multiple places throughout the file (lines 272, 297, 346). This will cause compilation errors.Add
ItemMod
back to the imports:-use syn::{parse_macro_input, DeriveInput, ItemStruct}; +use syn::{parse_macro_input, DeriveInput, ItemMod, ItemStruct};sdk-tests/native-compressible/src/decompress_dynamic_pda.rs (2)
95-98
: Use seeds from instruction data instead of hardcoded value.The function uses hardcoded seeds
b"dynamic_pda"
for PDA derivation, but should use the seeds provided incompressed_account_data.seeds
to support dynamic PDAs with different seed configurations.- // Derive PDA for verification using the provided bump - let seeds: Vec<&[u8]> = vec![b"dynamic_pda"]; + // Derive PDA for verification using the provided seeds + let seeds: Vec<&[u8]> = compressed_account_data.seeds + .iter() + .map(|s| s.as_slice()) + .collect(); let (derived_pda, expected_bump) = solana_program::pubkey::Pubkey::find_program_address(&seeds, &crate::ID);
143-146
: Make address space selection configurable instead of hardcoded.Using a hardcoded address space index could cause issues if the config has multiple address spaces or if tests expect different address spaces.
Consider passing the address space index as part of the instruction data or deriving it from the compressed accounts' tree info:
// Option 1: Add to instruction data struct pub struct DecompressMultipleInstructionData { pub proof: ValidityProof, pub compressed_accounts: Vec<CompressedAccountData<MyPdaAccount>>, pub bumps: Vec<u8>, pub system_accounts_offset: u8, pub address_space_index: u8, // New field } // Option 2: Derive from compressed accounts let address_space = compressed_accounts.first() .ok_or(LightSdkError::ConstraintViolation)? .tree_info .address_tree;sdk-tests/native-compressible/src/lib.rs (1)
54-55
: Replace panic with proper error handling.Using
panic!
in a Solana program is dangerous as it can cause the entire transaction to fail unexpectedly and consume all compute units.- _ => panic!("Invalid instruction discriminator."), + _ => Err(LightSdkError::InvalidInstructionData),This provides proper error handling that follows Solana best practices.
sdk-libs/sdk/src/compressible/compress_account_on_init.rs (3)
206-206
: Remove debug message.Debug messages should not be in production code.
- msg!("0 hi?");
276-284
: Remove debug messages from production code.Multiple debug messages should be removed as they clutter logs and may expose internal state.
- msg!("pda_accounts_info.len(): {:?}", pda_accounts_info.len()); - msg!("pda_accounts_data.len(): {:?}", pda_accounts_data.len()); - msg!("addresses.len(): {:?}", addresses.len()); - msg!("new_address_params.len(): {:?}", new_address_params.len()); - msg!( - "output_state_tree_indices.len(): {:?}", - output_state_tree_indices.len() - );
294-295
: Remove debug messages from production code.Debug messages should be removed as they clutter logs and may expose internal state.
- msg!("address tree: {:?}", tree); - msg!("expected address_space: {:?}", address_space);
🧹 Nitpick comments (4)
sdk-libs/macros/src/hasher/data_hasher.rs (1)
69-78
: Duplicate logic - consider extracting to a helper function.This block duplicates the field size truncation logic from lines 40-47. Consider extracting this into a helper function to maintain DRY principles.
+fn apply_field_size_truncation<H: ::light_hasher::Hasher>(mut result: [u8; 32]) -> [u8; 32] { + const POSEIDON_HASHER_ID: u8 = 0; + if H::ID != POSEIDON_HASHER_ID { + result[0] = 0; + } + result +} - let mut result = H::hashv(&[ + let result = H::hashv(&[ #(#data_hasher_assignments.as_slice(),)* ])?; - - // Apply field size truncation for non-Poseidon hashers - if H::ID != 0 { - result[0] = 0; - } - - Ok(result) + + Ok(apply_field_size_truncation::<H>(result))sdk-libs/macros/src/hasher/light_hasher.rs (1)
121-164
: Test coverage maintains backward compatibility.The existing test properly validates that the default
derive_light_hasher
function continues to generate Poseidon-based implementations, ensuring backward compatibility is preserved.Consider adding a test for the new SHA256 derive macro to ensure it generates the expected output with the SHA256 hasher:
#[test] fn test_light_hasher_sha() { let input: ItemStruct = parse_quote! { struct MyAccount { a: u32, b: i32, } }; let output = derive_light_hasher_sha(input).unwrap(); let formatted_output = unparse(&syn::parse2(output).unwrap()); // Verify that the generated code uses SHA256 instead of Poseidon assert!(formatted_output.contains("::light_hasher::Sha256")); assert!(!formatted_output.contains("::light_hasher::Poseidon")); }sdk-libs/macros/src/native_compressible.rs (1)
426-426
: Consider using ArrayVec for signer seeds to avoid heap allocations.Solana has a maximum of 16 seeds. Using ArrayVec would avoid heap operations.
+use arrayvec::ArrayVec; - signer_seeds_storage.push(seeds); + let seeds_array: ArrayVec<Vec<u8>, 16> = seeds.into_iter().collect(); + signer_seeds_storage.push(seeds_array.into_inner().unwrap());sdk-libs/sdk/src/account.rs (1)
333-335
: Document the purpose of zeroing first byte for non-zero hasher IDs.The code zeros the first byte of the data hash when the hasher ID is non-zero, but the reasoning isn't documented.
output.data_hash = self.account.hash::<H>()?; if H::ID != 0 { + // Zero the first byte for non-Poseidon hashers to distinguish hash types output.data_hash[0] = 0; }
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (19)
program-libs/hasher/src/keccak.rs
(1 hunks)program-libs/hasher/src/lib.rs
(1 hunks)program-libs/hasher/src/poseidon.rs
(1 hunks)program-libs/hasher/src/sha256.rs
(1 hunks)sdk-libs/macros/src/compressible.rs
(1 hunks)sdk-libs/macros/src/hasher/data_hasher.rs
(2 hunks)sdk-libs/macros/src/hasher/light_hasher.rs
(3 hunks)sdk-libs/macros/src/hasher/mod.rs
(1 hunks)sdk-libs/macros/src/hasher/to_byte_array.rs
(2 hunks)sdk-libs/macros/src/lib.rs
(2 hunks)sdk-libs/macros/src/native_compressible.rs
(1 hunks)sdk-libs/sdk/src/account.rs
(7 hunks)sdk-libs/sdk/src/compressible/compress_account.rs
(1 hunks)sdk-libs/sdk/src/compressible/compress_account_on_init.rs
(1 hunks)sdk-libs/sdk/src/compressible/decompress_idempotent.rs
(1 hunks)sdk-libs/sdk/src/lib.rs
(2 hunks)sdk-tests/anchor-compressible/src/lib.rs
(1 hunks)sdk-tests/native-compressible/src/decompress_dynamic_pda.rs
(1 hunks)sdk-tests/native-compressible/src/lib.rs
(1 hunks)
✅ Files skipped from review due to trivial changes (3)
- program-libs/hasher/src/sha256.rs
- program-libs/hasher/src/poseidon.rs
- program-libs/hasher/src/keccak.rs
🚧 Files skipped from review as they are similar to previous changes (2)
- sdk-libs/sdk/src/compressible/compress_account.rs
- sdk-libs/sdk/src/lib.rs
🧰 Additional context used
🧬 Code Graph Analysis (4)
sdk-libs/macros/src/hasher/mod.rs (2)
sdk-libs/macros/src/lib.rs (1)
light_hasher
(188-194)sdk-libs/macros/src/hasher/light_hasher.rs (2)
derive_light_hasher
(51-53)derive_light_hasher_sha
(55-57)
sdk-libs/macros/src/hasher/data_hasher.rs (4)
program-libs/hasher/src/keccak.rs (1)
hashv
(18-20)program-libs/hasher/src/lib.rs (1)
hashv
(29-29)program-libs/hasher/src/poseidon.rs (1)
hashv
(87-127)program-libs/hasher/src/sha256.rs (1)
hashv
(17-43)
sdk-libs/sdk/src/compressible/decompress_idempotent.rs (4)
sdk-libs/sdk/src/account.rs (4)
address
(291-293)account
(332-332)size
(87-87)size
(308-313)sdk-tests/native-compressible/src/lib.rs (4)
account
(162-162)account
(165-165)compression_info
(111-115)size
(133-136)sdk-libs/sdk/src/compressible/compression_info.rs (2)
compression_info
(8-8)new_decompressed
(34-39)sdk-tests/native-compressible/src/decompress_dynamic_pda.rs (1)
prepare_accounts_for_decompress_idempotent
(148-148)
sdk-libs/macros/src/lib.rs (4)
sdk-libs/macros/src/hasher/light_hasher.rs (7)
derive_light_hasher
(51-53)derive_light_hasher_sha
(55-57)syn
(161-161)syn
(210-210)syn
(289-289)syn
(346-346)syn
(395-395)sdk-libs/macros/src/native_compressible.rs (2)
syn
(46-46)add_native_compressible_instructions
(41-100)sdk-libs/macros/src/compressible.rs (2)
syn
(787-787)add_compressible_instructions
(783-1253)sdk-libs/macros/src/program.rs (1)
program
(290-301)
🪛 GitHub Actions: lint
sdk-libs/macros/src/compressible.rs
[error] 1134-1134: Trailing whitespace detected. Please remove trailing whitespace on this line.
[error] 1142-1142: Trailing whitespace detected. Please remove trailing whitespace on this line.
[error] 1146-1146: Trailing whitespace detected. Please remove trailing whitespace on this line.
[error] 1153-1153: Trailing whitespace detected. Please remove trailing whitespace on this line.
[error] 1156-1156: Trailing whitespace detected. Please remove trailing whitespace on this line.
[error] 1160-1160: Trailing whitespace detected. Please remove trailing whitespace on this line.
[error] 1165-1165: Trailing whitespace detected. Please remove trailing whitespace on this line.
sdk-tests/native-compressible/src/lib.rs
[warning] 138-144: rustfmt failed to format code properly. Please fix formatting issues in this file.
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Test batched-merkle-tree-simulate
🔇 Additional comments (38)
program-libs/hasher/src/lib.rs (1)
27-27
: AllHasher
implementations include the newID
constant
- Verified in program-libs/hasher/src/poseidon.rs (ID = 0)
- Verified in program-libs/hasher/src/sha256.rs (ID = 1)
- Verified in program-libs/hasher/src/keccak.rs (ID = 2)
No other
impl Hasher for …
found in the workspace.
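A simplified sketch of how such an ID constant can drive the first-byte truncation mentioned elsewhere in this review (the trait below is a stand-in, not the actual light_hasher::Hasher definition):
// Stand-in trait: only the ID constant matters for this sketch.
trait HasherId {
    const ID: u8;
}

struct Poseidon;
struct Sha256;

impl HasherId for Poseidon {
    const ID: u8 = 0;
}
impl HasherId for Sha256 {
    const ID: u8 = 1;
}

// Non-Poseidon digests get their first byte zeroed so the result stays within
// the field-size constraint the derived hashers rely on.
fn truncate_to_field_size<H: HasherId>(mut hash: [u8; 32]) -> [u8; 32] {
    if H::ID != 0 {
        hash[0] = 0;
    }
    hash
}

fn main() {
    assert_eq!(truncate_to_field_size::<Poseidon>([0xFF; 32])[0], 0xFF);
    assert_eq!(truncate_to_field_size::<Sha256>([0xFF; 32])[0], 0);
}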
Breaking change is confirmed and safely applied—approving.sdk-libs/macros/src/hasher/mod.rs (1)
7-7
: LGTM! Clean addition of SHA256 hasher support.The addition of
derive_light_hasher_sha
to the public exports properly exposes the new SHA256 hasher derive functionality, maintaining consistency with the existingderive_light_hasher
function.sdk-libs/macros/src/hasher/to_byte_array.rs (2)
7-13
: LGTM! Clean parameterization of hasher type.The function signature change properly parameterizes the hasher type, enabling support for multiple hasher algorithms in the generated code. The naming change clearly reflects the new capability.
48-48
: Good use of parameterized hasher in generated code.The replacement of the hardcoded
::light_hasher::Poseidon
with the parameterizedhasher
properly enables customizable hasher types in the generatedto_byte_array
implementation.sdk-libs/macros/src/hasher/light_hasher.rs (2)
51-57
: Excellent refactoring for multi-hasher support.The refactoring maintains backward compatibility while introducing SHA256 support through a clean delegation pattern. The separation of concerns between the public API functions and the internal implementation is well-designed.
59-101
: Well-structured internal implementation.The
derive_light_hasher_with_hasher
function properly parameterizes the hasher type while maintaining all existing logic. The delegation togenerate_to_byte_array_impl_with_hasher
with the hasher parameter correctly integrates with the updated function signature.sdk-libs/macros/src/lib.rs (9)
10-10
: LGTM!The new module additions for
compressible
andnative_compressible
are necessary to support the new compressible account macros being introduced.Also applies to: 14-14
157-194
: Excellent documentation improvements!The expanded documentation with comprehensive examples significantly improves developer experience. The example code demonstrates proper usage patterns and the hash attribute explanation is very helpful.
196-220
: Well-implemented SHA256 hasher variant!The new
LightHasherSha
derive macro is properly implemented with comprehensive documentation and follows the same pattern as the existingLightHasher
macro. This provides users with a choice between Poseidon and SHA256 hashing algorithms.
222-255
: Useful automation for compression info trait implementation!The
HasCompressionInfo
derive macro effectively automates trait implementation with clear documentation about requirements and proper example usage. The requirement for exactly onecompression_info
field with specific type is well-explained.
257-277
: Well-designed macro for Anchor compressible instructions!The
add_compressible_instructions
macro is well-documented with clear usage examples and important placement requirements. The design allows for flexible account type specification.Note: This macro uses
syn::ItemMod
at line 272, confirming the need for the ItemMod import fix mentioned earlier.
279-302
: Comprehensive native compressible instructions macro!The
add_native_compressible_instructions
macro is excellently documented, clearly explaining all generated components (unified data structures, instruction data structs, processor functions). This provides a complete solution for native Solana programs.
305-311
: LGTM!The conversion from derive to attribute macro for
account
is a reasonable design change that provides more flexibility in account processing.
313-327
: Improved documentation and naming!The updated documentation clearly explains that this macro derives the CPI signer PDA from a program ID string literal at compile time. The example usage is helpful and the name is more concise.
329-351
: Good conversion to attribute macro!The change from derive to attribute macro for
light_program
provides better flexibility for program module processing. The documentation with example usage is helpful.Note: This macro also uses
syn::ItemMod
at line 346, further confirming the need for the ItemMod import fix.sdk-tests/native-compressible/src/decompress_dynamic_pda.rs (3)
1-20
: Well-designed struct and imports!The
CompressedAccountData<T>
struct is properly generic and well-documented. The imports are appropriate for the functionality being implemented.
124-142
: Excellent handling of complex reference lifetimes!The signer seeds construction properly manages the complex lifetime requirements, correctly converting from owned vectors to the nested slice references required by the SDK. The staged approach with storage vectors and reference conversion is well-implemented.
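A minimal sketch of that staged pattern in isolation (owned seed storage first, borrowed views second), independent of the SDK's types:
// Owned storage outlives the borrowed views handed to the CPI call, which is
// what makes the nested &[&[&[u8]]] shape possible without per-call cloning.
fn as_seed_refs(owned: &[Vec<Vec<u8>>]) -> Vec<Vec<&[u8]>> {
    owned
        .iter()
        .map(|seeds| seeds.iter().map(|s| s.as_slice()).collect())
        .collect()
}

fn main() {
    let storage: Vec<Vec<Vec<u8>>> = vec![vec![b"dynamic_pda".to_vec(), vec![254]]];
    let seed_refs = as_seed_refs(&storage);
    let signer_seeds: Vec<&[&[u8]]> = seed_refs.iter().map(|v| v.as_slice()).collect();
    assert_eq!(signer_seeds.len(), 1);
    assert_eq!(signer_seeds[0][0], b"dynamic_pda".as_ref());
}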
147-177
: Solid CPI logic and struct definitions!The CPI invocation logic is correct and the additional struct definitions (
DecompressToPdaInstructionData
,MyCompressedAccount
) are well-designed for their intended use cases.sdk-tests/native-compressible/src/lib.rs (5)
1-27
: Well-organized imports and module structure!The imports cover all necessary functionality from the Light SDK, and the module organization provides clear separation of concerns for different PDA operations.
64-65
: Good error handling implementation!The instruction discriminator parsing correctly uses
map_err
to handle errors gracefully instead of usingunwrap()
, which prevents panics and follows Solana best practices.
67-97
: Clean instruction dispatch implementation!The match-based instruction dispatch clearly delegates to appropriate module functions and properly propagates errors. The pattern is maintainable and follows good Solana program practices.
`100-137`: Well-designed account struct with proper trait implementations! The `MyPdaAccount` struct has appropriate derives, and the manual trait implementations for `HasCompressionInfo` and `Size` are correct. The use of `expect()` in the trait methods is acceptable, since `compression_info` should always be `Some` on-chain.
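The `expect()` pattern looks roughly like this; the trait method names come from this PR, but their exact signatures are assumptions and may differ from the shipped SDK:

```rust
use light_sdk::compressible::{CompressionInfo, HasCompressionInfo};

pub struct MyPdaAccount {
    pub compression_info: Option<CompressionInfo>,
    pub data: [u8; 31],
}

impl HasCompressionInfo for MyPdaAccount {
    fn compression_info(&self) -> &CompressionInfo {
        // `None` only occurs for the compressed (off-chain) representation,
        // so expect() is acceptable for accounts that live on-chain.
        self.compression_info
            .as_ref()
            .expect("compression_info must be set on-chain")
    }

    fn compression_info_mut(&mut self) -> &mut CompressionInfo {
        self.compression_info
            .as_mut()
            .expect("compression_info must be set on-chain")
    }

    fn compression_info_mut_opt(&mut self) -> &mut Option<CompressionInfo> {
        &mut self.compression_info
    }

    fn set_compression_info_none(&mut self) {
        self.compression_info = None;
    }
}
```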
`139-206`: Comprehensive hashing tests, with a formatting issue to fix. The test module effectively verifies the differences between the Poseidon and SHA256 hashing algorithms and checks proper byte-array conversion. However, the pipeline failure indicates rustfmt issues: please run `cargo fmt` to fix the formatting problems reported by CI.

sdk-libs/sdk/src/compressible/compress_account_on_init.rs (5)
`1-68`: Well-designed function architecture! The imports are comprehensive and appropriate. The single-account wrapper properly delegates to the multi-account helper, following good code-organization principles with proper Anchor feature gating.

`70-130`: Excellent function documentation and validation! The multi-account preparation function has comprehensive documentation explaining all parameters and return values, and the input and address-space validation provide proper security checks.

`131-171`: Correct account processing and cleanup! The account-processing loop properly handles compression-info initialization, compressed-account creation, and data cloning, and uses Anchor's built-in `close()` method for proper account cleanup.

`300-340`: Proper native account processing! The native loop correctly adapts the Anchor variant's logic for native Solana, using the manual `close()` function since Anchor's built-in account closing isn't available.

`342-373`: Excellent native account-closing implementation! The manual `close()` function properly implements Solana account-closing procedures with correct lamport transfers, system-program assignment, and zero-size reallocation. The error handling and borrow management are appropriate.
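For comparison, a minimal sketch of such a native close helper using standard Solana APIs (this is not the SDK's actual implementation):

```rust
use solana_program::{
    account_info::AccountInfo, entrypoint::ProgramResult,
    program_error::ProgramError, system_program,
};

/// Close a native account: drain lamports to the recipient, hand the
/// account back to the system program, and shrink its data to zero.
pub fn close<'info>(
    account: &AccountInfo<'info>,
    recipient: &AccountInfo<'info>,
) -> ProgramResult {
    // Move all lamports out of the account being closed.
    let dest_starting_lamports = recipient.lamports();
    **recipient.try_borrow_mut_lamports()? = dest_starting_lamports
        .checked_add(account.lamports())
        .ok_or(ProgramError::ArithmeticOverflow)?;
    **account.try_borrow_mut_lamports()? = 0;

    // Reassign ownership and drop the data so the account is fully closed.
    account.assign(&system_program::ID);
    account.realloc(0, false)?;
    Ok(())
}
```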
sdk-tests/anchor-compressible/src/lib.rs (10)

`1-27`: Well-organized imports and program setup! The imports comprehensively cover the necessary Light SDK functionality, and the program ID with CPI signer derivation is properly configured.

`28-69`: Solid compression instruction implementation! The `create_record` instruction correctly implements the compression workflow with proper config validation, rent-recipient verification, and appropriate error handling via custom error codes.

`71-81`: Correct update instruction with proper compression-info management! The update instruction properly handles field updates and correctly manages compression info by manually setting the last written slot.

`84-125`: Good delegation to SDK config management! The compression-config instructions properly delegate to the SDK functions (`process_initialize_compression_config_checked` and `process_update_compression_config`) with appropriate parameter passing, following good separation of concerns.

`132-232`: Excellent complex decompression instruction! The `decompress_accounts_idempotent` instruction handles complex multi-account decompression with proper validation, variant-based dispatching, and correct CPI invocation. The implementation effectively manages different account types through the unified enum pattern.

`235-291`: Consistent and correct game-session creation! The `create_game_session` instruction follows the same solid pattern as `create_record`, with proper account initialization, config validation, and compression workflow.

`294-381`: Advanced multi-account compression implementation! The `create_user_record_and_game_session` instruction excellently demonstrates batching multiple different account types in a single transaction, correctly using the prepare functions and CPI batching with multiple new addresses.

`388-416`: Proper existing-account compression! The `compress_record` instruction correctly handles compression of existing accounts with proper config validation, rent-recipient checks, and appropriate use of the `compress_account` function.

`419-577`: Well-designed account contexts with proper constraints! The Anchor account context structs are well defined, with appropriate seeds, space calculations, ownership constraints, and proper documentation for unchecked accounts.

`578-772`: Excellent account type system with a unified enum! The account type definitions are well structured, with a unified `CompressedAccountVariant` enum that properly implements all necessary traits. The individual account structs have appropriate derives, and the manual trait implementations correctly delegate to the underlying types.
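The shape of that unified-enum pattern, reduced to plain Borsh so it stands alone (field contents are placeholders; the real structs also carry compression info and the Light discriminators):

```rust
use borsh::{BorshDeserialize, BorshSerialize};

#[derive(BorshSerialize, BorshDeserialize, Clone, Debug)]
pub struct UserRecord {
    pub score: u64,
}

#[derive(BorshSerialize, BorshDeserialize, Clone, Debug)]
pub struct GameSession {
    pub session_id: u64,
}

// One enum lets a single decompress instruction accept either account type;
// the Borsh variant tag tells the handler which struct it is dealing with,
// and therefore which discriminator and seeds to use.
#[derive(BorshSerialize, BorshDeserialize, Clone, Debug)]
pub enum CompressedAccountVariant {
    UserRecord(UserRecord),
    GameSession(GameSession),
}

fn main() {
    let variant = CompressedAccountVariant::GameSession(GameSession { session_id: 7 });

    let mut bytes = Vec::new();
    variant.serialize(&mut bytes).unwrap();
    assert_eq!(bytes[0], 1); // enum tag: second variant

    let decoded = CompressedAccountVariant::try_from_slice(&bytes).unwrap();
    match decoded {
        CompressedAccountVariant::UserRecord(r) => println!("user score: {}", r.score),
        CompressedAccountVariant::GameSession(s) => println!("session: {}", s.session_id),
    }
}
```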
let mut result = H::hashv(slices.as_slice())?;

// Apply field size truncation for non-Poseidon hashers
if H::ID != 0 {
    result[0] = 0;
}

Ok(result)
🛠️ Refactor suggestion
Clarify the field size truncation logic and improve code documentation.
The conditional zeroing of the first byte for non-Poseidon hashers needs better documentation. The magic number 0
for Poseidon ID and the rationale for this post-processing step should be explained.
Consider adding documentation and constants:
+ // Poseidon hasher uses ID 0 and doesn't require field size truncation
+ const POSEIDON_HASHER_ID: u8 = 0;
+
let mut result = H::hashv(slices.as_slice())?;
- // Apply field size truncation for non-Poseidon hashers
- if H::ID != 0 {
+ // Apply field size truncation for non-Poseidon hashers to ensure BN254 field compatibility
+ if H::ID != POSEIDON_HASHER_ID {
result[0] = 0;
}
Also, please clarify why field size truncation is needed for non-Poseidon hashers but not for Poseidon. This seems like an important architectural decision that should be documented.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
// Poseidon hasher uses ID 0 and doesn't require field size truncation
const POSEIDON_HASHER_ID: u8 = 0;

let mut result = H::hashv(slices.as_slice())?;

// Apply field size truncation for non-Poseidon hashers to ensure BN254 field compatibility
if H::ID != POSEIDON_HASHER_ID {
    result[0] = 0;
}

Ok(result)
🤖 Prompt for AI Agents
In sdk-libs/macros/src/hasher/data_hasher.rs around lines 40 to 47, the code
zeroes the first byte of the hash result for non-Poseidon hashers using a magic
number 0 for Poseidon ID without explanation. Add a constant to represent the
Poseidon hasher ID instead of using the literal 0, and include comments
explaining why this truncation is applied only for non-Poseidon hashers,
clarifying the architectural reasoning behind this post-processing step.
lint remove unused _output_account_info update lightdiscriminator macro
force-pushed from cdcf564 to ebb048c (Compare)
add compress_pda helper compress_pda compiling decompress_idempotent.rs wip wip decompress batch idempotent wip add compress_pda_new and compress_multiple_pdas_new native program with decompress done compress_dynamic, decompress_dynamic wip adding anchor testprogram uses sdk fix compilation wip experiment with procmacro skip SLOT check at compress_pda_new wip add_compressible_instructions() works for compress, idl gen works add proc macro decompress_multiple_pdas is working as it should fix decompress_idempotent impl rm expanded add remove_data force apps to pass the whole signer_seeds directly add compressible_config draft add create_config_unchecked and checked use config, add unified header struct (just last_written_slot) for now use hascompressioninfo and compressioninfo add config support to compressible macro add expanded.rs cleanup anchor-derived example add support for multiple address_trees per address space add support for multiple address_trees per address space update macro to multiple address_trees add test-sdk-derived program wip cleanup native macro-derive example wip fix compilation config tests working clean up test_config.rs testing add a separate anchor compress_pda_new version so we dont have redundant serde wip wip wip fix decompress_idempotent anchor add test with 2nd account decompress works with multiple different PDAs cleanup, remove anchor helper decompress_multiple_pdas, fix discriminator writes rm examples simplify add compress_multiple_new + test add test add test: compressing multiple accounts_new fix compress_pda and add test added test case: invalid compression cleanup cleanup fmt and lint for anchor-compressible-user program config ix helpers added tests working, added generic create_compress_account_instruction rm warnings added standardized decompress_multiple_accounts_idempotent client rm idl add cu logger util wip add test: decompress with accounts stored in 2 different v2 state trees, added 2nd tree impl. to program-test and cli, added util to xtask clean clean up librs add opt anchor program error conv for errors, fix compression_info usage with single account struct wip clean fmt comments wip sdk-tests working extend and fix sdk-test tests, cleanup csdk, anchor program, update macro clean clean clean data_hasher.rs clean derive program tests, renaming, better simulate_cu err handling renames, remove deadcode remove unused discriminator field from config account move variable length field to end of config account struct simplify anchor/borsh serde imports add DEFAULT_DATA_HASH to constants rename CompressionInfo::new() to CompressionInfo::new_decompressed() xtask, remove helper wip remove redundant owner_program param add compile time size() rename pda_account to solana_account fix doctest output.data correctly clean initconfig, updatteconfig accounts struct rename signer_seeds to solana_accounts__signer_seeds move programs to sdk-test dir move to light-compressible-client lint ci wip wip add check load config add sdk-tests to workspace rename sdk-tests program names fix ci fix asserts in tests fmt update pkg json for sdk-tests debug prints for ci in test_indexer fmt make macro robust, lint anchor-discriminator-compat feature for LightDiscriminator lint add test_discriminator test replace actionlint revert actionlint macro: allow flex imports wip fmt update macro add sha hash function support to LightAccount and LightHasher + tests DataHasher macro: explicit validation for sha
force-pushed from ebb048c to 05746a1 (Compare)
Actionable comments posted: 5
🔭 Outside diff range comments (1)
sdk-libs/macros/src/lib.rs (1)
`251-259`: Document the DataHasher breaking change. The `DataHasher` derive macro now uses SHA256 internally instead of Poseidon (line 256). This is a breaking change that affects hashing behavior. Add a breaking-change notice in the documentation:

 /// Alias of `LightHasher`.
+///
+/// **Breaking Change**: As of version X.Y.Z, DataHasher uses SHA256 hashing
+/// internally. For Poseidon hashing, use `LightHasher` directly.
 #[proc_macro_derive(DataHasher, attributes(skip, hash))]

Consider whether this should maintain backward compatibility by using Poseidon instead.
♻️ Duplicate comments (15)
.github/workflows/sdk-tests.yml (1)
30-90
: Add explicit permissions block to limit GITHUB_TOKEN scopeThis workflow should explicitly define permissions to follow security best practices and limit the GITHUB_TOKEN scope.
sdk-libs/macros/src/native_compressible.rs (2)
153-153
: Zero discriminator for enum could cause conflictsThe
CompressedAccountVariant
enum uses[0; 8]
as its discriminator, which could conflict with other account types.
391-391
: Document required constants for macro usersThe generated code references
crate::LIGHT_CPI_SIGNER
andcrate::ID
which must be defined by the crate using this macro. This requirement should be documented.Also applies to: 503-503
sdk-tests/anchor-compressible/tests/test_instruction_builders.rs (1)
227-229
: Verify instruction data format assumption.The test assumes a specific instruction data format with hardcoded bytes
[7, 0, 2, 0, 0, 0, 0, 0]
. This makes the test brittle if the instruction format changes.sdk-tests/native-compressible/src/create_dynamic_pda.rs (2)
27-32
: Add bounds checking for accounts array access.The function accesses accounts array indices 0-4 without checking if sufficient accounts are provided. This could cause a panic if fewer than 5 accounts are passed.
110-110
: Validate remaining accounts length before accessing.The CpiAccounts constructor accesses
accounts[5..]
without ensuring enough accounts exist.sdk-libs/sdk/src/account.rs (1)
196-196
: Address DEFAULT_DATA_HASH security TODOs.The TODO comments about reviewing security for
DEFAULT_DATA_HASH
are still present from previous reviews.These TODOs should be resolved by:
- Documenting the security analysis in the code
- Confirming DEFAULT_DATA_HASH ([1; 32]) doesn't introduce collision risks
- Removing the TODO comments once verified
#!/bin/bash # Check if DEFAULT_DATA_HASH is used consistently across the codebase rg -A 2 -B 2 "DEFAULT_DATA_HASH" --type rustAlso applies to: 331-331
sdk-libs/sdk/src/compressible/config.rs (1)
34-34
: Consider simplifying address_space to single Pubkey.Since the code enforces exactly 1 address tree throughout, using
Vec<Pubkey>
adds unnecessary complexity.The validation in multiple places confirms only 1 pubkey is allowed, so the Vec wrapper provides no benefit and complicates the API.
Also applies to: 81-87, 179-185
sdk-tests/anchor-compressible-derived/src/lib.rs (1)
26-26
: Incomplete re-exports for macro-generated instructions.The
add_compressible_instructions
macro generates several instruction handlers and account structs, but onlyCompressedAccountData
andCompressedAccountVariant
are re-exported. This limits client access to the full API.sdk-tests/native-compressible/src/lib.rs (2)
54-55
: Replace panic with proper error handling.Using
panic!
in a Solana program can cause transaction failure and consume all compute units.
64-65
: Add bounds check for instruction data.The code accesses
instruction_data[0]
without checking if the slice is empty, which could cause a panic.sdk-libs/sdk/src/compressible/compress_account_on_init.rs (2)
206-206
: Remove debug message.Debug messages should not be in production code.
276-284
: Remove all debug messages from production code.Multiple debug messages are present that should be removed to avoid cluttering logs and exposing internal state.
Also applies to: 294-295
sdk-tests/anchor-compressible/tests/test_decompress_multiple.rs (1)
523-524
: Remove duplicate CU simulation.The compute unit simulation is performed twice with identical parameters.
sdk-libs/light-compressible-client/src/lib.rs (1)
346-358
: Address TODO: Refactor tree info matching logicThe TODO comment correctly identifies the need for a cleaner implementation. This duplicates the logic from
compress_account
method. As suggested above, extract this into a shared helper method.Would you like me to create an issue to track this refactoring task?
🧹 Nitpick comments (9)
sdk-tests/native-compressible/src/create_dynamic_pda.rs (1)
49-50
: Consider making PDA seed configurable.The PDA seed is hardcoded as
"dynamic_pda"
. For better flexibility and reusability, consider passing the seed as a parameter or through the instruction data.- let seed_data = b"dynamic_pda"; // You can customize this based on your needs - let (derived_pda, bump_seed) = Pubkey::find_program_address(&[seed_data], &crate::ID); + // Use a configurable seed from instruction data or a parameter + let seed_data = instruction_data.pda_seed.as_bytes(); // Add pda_seed to CreateDynamicPdaInstructionData + let (derived_pda, bump_seed) = Pubkey::find_program_address(&[seed_data], &crate::ID);sdk-libs/macros/src/EXAMPLE_USAGE.md (1)
`186-193`: Add language specifiers to fenced code blocks. The account-layout documentation blocks should specify a language for better syntax highlighting and clarity:

-```
+```text
 0. [writable, signer] Payer account
 1. [writable] Config PDA (seeds: [b"compressible_config"])
 2. [] Program data account
 3. [signer] Program upgrade authority
 4. [] System program
-```
+```

Apply the same change to the other two code blocks (lines 196 and 207).
Also applies to: 196-204, 207-214
sdk-libs/macros/src/compressible.rs (1)
152-165
: Extract duplicated account pattern matching logic.The logic for checking potential account patterns is duplicated between the
UseTree::Name
andUseTree::Rename
branches.Extract this into a helper function:
+fn is_potential_account_struct(name: &str) -> bool { + const POTENTIAL_ACCOUNT_PATTERNS: &[&str] = &[ + "Initialize", "Create", "Update", "Deposit", "Withdraw", "Swap", + ]; + POTENTIAL_ACCOUNT_PATTERNS.iter().any(|&pattern| name.contains(pattern)) +} UseTree::Name(use_name) => { let local_name = use_name.ident.to_string(); // ... existing code ... import_info.add_import(local_name.clone(), full_path); - let potential_account_patterns = [ - "Initialize", - "Create", - "Update", - "Deposit", - "Withdraw", - "Swap", - ]; - if potential_account_patterns - .iter() - .any(|&pattern| local_name.contains(pattern)) - { + if is_potential_account_struct(&local_name) { import_info.add_potential_account_struct(local_name); } }Also applies to: 177-190
sdk-libs/sdk/src/compressible/config.rs (1)
357-357
: Define constant for ProgramData discriminator.The magic number
3
for ProgramData discriminator should be a named constant.+const PROGRAM_DATA_DISCRIMINATOR: u32 = 3; + // Check discriminator (should be 3 for ProgramData) let discriminator = u32::from_le_bytes([data[0], data[1], data[2], data[3]]); -if discriminator != 3 { +if discriminator != PROGRAM_DATA_DISCRIMINATOR { msg!("Invalid program data discriminator"); return Err(LightSdkError::ConstraintViolation.into()); }sdk-tests/anchor-compressible-derived/src/lib.rs (2)
55-55
: Consider making the initial score configurable.The score is hardcoded to
11
. Consider making this a parameter or a named constant for better maintainability.- user_record.score = 11; + const INITIAL_SCORE: u64 = 11; + user_record.score = INITIAL_SCORE;
127-127
: Use the same constant for initial score.This hardcoded score value should use the same constant suggested earlier for consistency.
- user_record.score = 11; + user_record.score = INITIAL_SCORE;sdk-tests/native-compressible/tests/test_compressible_flow.rs (1)
62-73
: Remove redundant PDA and address derivations.The PDA and compressed address are derived here but not used, as
create_and_compress_account
performs the same derivations internally.- let test_data = [1u8; 31]; - - let seeds: &[&[u8]] = &[b"dynamic_pda"]; - let (pda_pubkey, _bump) = Pubkey::find_program_address(seeds, &native_compressible::ID); - - let address_tree_pubkey = rpc.get_address_merkle_tree_v2(); - - let compressed_address = derive_address( - &pda_pubkey.to_bytes(), - &address_tree_pubkey.to_bytes(), - &native_compressible::ID.to_bytes(), - ); - - let pda_pubkey = create_and_compress_account(&mut rpc, &payer, test_data).await; + let test_data = [1u8; 31]; + let pda_pubkey = create_and_compress_account(&mut rpc, &payer, test_data).await; + + let address_tree_pubkey = rpc.get_address_merkle_tree_v2(); + let compressed_address = derive_address( + &pda_pubkey.to_bytes(), + &address_tree_pubkey.to_bytes(), + &native_compressible::ID.to_bytes(), + );sdk-tests/anchor-compressible/src/lib.rs (2)
28-69
: Well-implemented account creation with proper validation.The instruction correctly:
- Validates compression config and rent recipient
- Sets up CPI accounts properly
- Uses
compress_account_on_init
for safe compressionConsider making the hardcoded score value (line 43) configurable via instruction parameters for better flexibility.
132-232
: Excellent idempotent decompression implementation with multi-type support.The handler correctly:
- Validates input lengths match (lines 144-148)
- Handles both account variants via pattern matching
- Processes each account type with appropriate discriminators
- Uses proper CPI invocation
The duplicate seed processing logic (lines 171-175, 198-202) could be extracted into a helper function for better maintainability.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (2)
Cargo.lock
is excluded by!**/*.lock
pnpm-lock.yaml
is excluded by!**/pnpm-lock.yaml
📒 Files selected for processing (91)
.github/workflows/ci-lint.yml
(1 hunks).github/workflows/light-examples-tests.yml
(2 hunks).github/workflows/sdk-tests.yml
(1 hunks)Cargo.toml
(4 hunks)cli/accounts/batch_state_merkle_tree_2_2Yb3fGo2E9aWLjY8KuESaqurYpGGhEeJr7eynKrSgXwS.json
(1 hunks)cli/accounts/batched_output_queue_2_12wJT3xYd46rtjeqDU6CrtT8unqLjPiheggzqhN9YsyB.json
(1 hunks)cli/accounts/cpi_context_batched_2_HwtjxDvFEXiWnzeMeWkMBzpQN45A95rTJNZmz1Z3pe8R.json
(1 hunks)pnpm-workspace.yaml
(1 hunks)program-libs/hasher/src/keccak.rs
(1 hunks)program-libs/hasher/src/lib.rs
(1 hunks)program-libs/hasher/src/poseidon.rs
(1 hunks)program-libs/hasher/src/sha256.rs
(1 hunks)program-tests/package.json
(1 hunks)program-tests/sdk-anchor-test/Anchor.toml
(1 hunks)program-tests/sdk-anchor-test/package.json
(2 hunks)program-tests/sdk-test/src/lib.rs
(0 hunks)program-tests/sdk-test/tests/test.rs
(0 hunks)program-tests/system-cpi-test/tests/test_program_owned_trees.rs
(1 hunks)program-tests/utils/src/test_keypairs.rs
(2 hunks)programs/package.json
(1 hunks)scripts/format.sh
(1 hunks)sdk-libs/client/Cargo.toml
(2 hunks)sdk-libs/client/src/indexer/tree_info.rs
(1 hunks)sdk-libs/client/src/lib.rs
(1 hunks)sdk-libs/client/src/rpc/client.rs
(1 hunks)sdk-libs/light-compressible-client/Cargo.toml
(1 hunks)sdk-libs/light-compressible-client/src/lib.rs
(1 hunks)sdk-libs/macros/CHANGELOG.md
(1 hunks)sdk-libs/macros/Cargo.toml
(1 hunks)sdk-libs/macros/src/EXAMPLE_USAGE.md
(1 hunks)sdk-libs/macros/src/compressible.rs
(1 hunks)sdk-libs/macros/src/cpi_signer.rs
(1 hunks)sdk-libs/macros/src/discriminator.rs
(2 hunks)sdk-libs/macros/src/hasher/data_hasher.rs
(2 hunks)sdk-libs/macros/src/hasher/input_validator.rs
(1 hunks)sdk-libs/macros/src/hasher/light_hasher.rs
(5 hunks)sdk-libs/macros/src/hasher/mod.rs
(1 hunks)sdk-libs/macros/src/hasher/to_byte_array.rs
(2 hunks)sdk-libs/macros/src/lib.rs
(4 hunks)sdk-libs/macros/src/native_compressible.rs
(1 hunks)sdk-libs/program-test/Cargo.toml
(1 hunks)sdk-libs/program-test/src/accounts/initialize.rs
(2 hunks)sdk-libs/program-test/src/accounts/test_accounts.rs
(2 hunks)sdk-libs/program-test/src/accounts/test_keypairs.rs
(3 hunks)sdk-libs/program-test/src/indexer/test_indexer.rs
(3 hunks)sdk-libs/program-test/src/lib.rs
(1 hunks)sdk-libs/program-test/src/program_test/compressible_setup.rs
(1 hunks)sdk-libs/program-test/src/program_test/mod.rs
(2 hunks)sdk-libs/program-test/src/utils/mod.rs
(1 hunks)sdk-libs/program-test/src/utils/simulation.rs
(1 hunks)sdk-libs/sdk-types/src/constants.rs
(1 hunks)sdk-libs/sdk/Cargo.toml
(2 hunks)sdk-libs/sdk/src/account.rs
(9 hunks)sdk-libs/sdk/src/compressible/compress_account.rs
(1 hunks)sdk-libs/sdk/src/compressible/compress_account_on_init.rs
(1 hunks)sdk-libs/sdk/src/compressible/compression_info.rs
(1 hunks)sdk-libs/sdk/src/compressible/config.rs
(1 hunks)sdk-libs/sdk/src/compressible/decompress_idempotent.rs
(1 hunks)sdk-libs/sdk/src/compressible/mod.rs
(1 hunks)sdk-libs/sdk/src/error.rs
(1 hunks)sdk-libs/sdk/src/lib.rs
(2 hunks)sdk-tests/anchor-compressible-derived/Cargo.toml
(1 hunks)sdk-tests/anchor-compressible-derived/README.md
(1 hunks)sdk-tests/anchor-compressible-derived/Xargo.toml
(1 hunks)sdk-tests/anchor-compressible-derived/src/constraints.rs
(1 hunks)sdk-tests/anchor-compressible-derived/src/lib.rs
(1 hunks)sdk-tests/anchor-compressible-derived/src/state.rs
(1 hunks)sdk-tests/anchor-compressible-derived/tests/test_decompress_multiple.rs
(1 hunks)sdk-tests/anchor-compressible/CONFIG.md
(1 hunks)sdk-tests/anchor-compressible/Cargo.toml
(1 hunks)sdk-tests/anchor-compressible/Xargo.toml
(1 hunks)sdk-tests/anchor-compressible/src/lib.rs
(1 hunks)sdk-tests/anchor-compressible/tests/test_config.rs
(1 hunks)sdk-tests/anchor-compressible/tests/test_decompress_multiple.rs
(1 hunks)sdk-tests/anchor-compressible/tests/test_discriminator.rs
(1 hunks)sdk-tests/anchor-compressible/tests/test_instruction_builders.rs
(1 hunks)sdk-tests/native-compressible/Cargo.toml
(4 hunks)sdk-tests/native-compressible/src/compress_dynamic_pda.rs
(1 hunks)sdk-tests/native-compressible/src/create_config.rs
(1 hunks)sdk-tests/native-compressible/src/create_dynamic_pda.rs
(1 hunks)sdk-tests/native-compressible/src/create_pda.rs
(2 hunks)sdk-tests/native-compressible/src/decompress_dynamic_pda.rs
(1 hunks)sdk-tests/native-compressible/src/lib.rs
(1 hunks)sdk-tests/native-compressible/src/update_config.rs
(1 hunks)sdk-tests/native-compressible/src/update_pda.rs
(2 hunks)sdk-tests/native-compressible/tests/test_compressible_flow.rs
(1 hunks)sdk-tests/native-compressible/tests/test_config.rs
(1 hunks)sdk-tests/package.json
(1 hunks)xtask/Cargo.toml
(1 hunks)xtask/src/create_batch_state_tree.rs
(1 hunks)xtask/src/new_deployment.rs
(1 hunks)
💤 Files with no reviewable changes (2)
- program-tests/sdk-test/src/lib.rs
- program-tests/sdk-test/tests/test.rs
✅ Files skipped from review due to trivial changes (11)
- xtask/Cargo.toml
- pnpm-workspace.yaml
- sdk-libs/program-test/Cargo.toml
- scripts/format.sh
- programs/package.json
- program-tests/system-cpi-test/tests/test_program_owned_trees.rs
- cli/accounts/cpi_context_batched_2_HwtjxDvFEXiWnzeMeWkMBzpQN45A95rTJNZmz1Z3pe8R.json
- sdk-tests/package.json
- program-tests/package.json
- sdk-tests/anchor-compressible-derived/README.md
- cli/accounts/batched_output_queue_2_12wJT3xYd46rtjeqDU6CrtT8unqLjPiheggzqhN9YsyB.json
🚧 Files skipped from review as they are similar to previous changes (57)
- .github/workflows/ci-lint.yml
- program-tests/sdk-anchor-test/package.json
- program-tests/sdk-anchor-test/Anchor.toml
- program-libs/hasher/src/lib.rs
- program-libs/hasher/src/sha256.rs
- program-libs/hasher/src/keccak.rs
- program-libs/hasher/src/poseidon.rs
- sdk-tests/anchor-compressible/Xargo.toml
- sdk-libs/client/src/lib.rs
- sdk-libs/sdk/Cargo.toml
- sdk-libs/sdk-types/src/constants.rs
- sdk-libs/macros/Cargo.toml
- xtask/src/create_batch_state_tree.rs
- sdk-tests/anchor-compressible/tests/test_discriminator.rs
- sdk-libs/client/src/rpc/client.rs
- sdk-libs/sdk/src/error.rs
- sdk-libs/program-test/src/utils/mod.rs
- sdk-tests/native-compressible/src/update_pda.rs
- sdk-libs/macros/src/hasher/mod.rs
- sdk-libs/program-test/src/lib.rs
- sdk-libs/macros/src/hasher/input_validator.rs
- sdk-libs/program-test/src/program_test/mod.rs
- xtask/src/new_deployment.rs
- sdk-tests/anchor-compressible-derived/Xargo.toml
- sdk-libs/program-test/src/accounts/test_accounts.rs
- sdk-tests/native-compressible/src/create_pda.rs
- sdk-libs/light-compressible-client/Cargo.toml
- .github/workflows/light-examples-tests.yml
- sdk-libs/client/src/indexer/tree_info.rs
- sdk-libs/program-test/src/accounts/initialize.rs
- sdk-libs/macros/src/cpi_signer.rs
- sdk-libs/client/Cargo.toml
- sdk-tests/anchor-compressible/CONFIG.md
- Cargo.toml
- sdk-libs/program-test/src/utils/simulation.rs
- sdk-tests/anchor-compressible-derived/src/constraints.rs
- sdk-libs/program-test/src/accounts/test_keypairs.rs
- sdk-tests/native-compressible/src/update_config.rs
- sdk-libs/macros/src/hasher/data_hasher.rs
- sdk-libs/program-test/src/indexer/test_indexer.rs
- sdk-tests/anchor-compressible-derived/src/state.rs
- cli/accounts/batch_state_merkle_tree_2_2Yb3fGo2E9aWLjY8KuESaqurYpGGhEeJr7eynKrSgXwS.json
- sdk-libs/program-test/src/program_test/compressible_setup.rs
- sdk-tests/native-compressible/src/compress_dynamic_pda.rs
- program-tests/utils/src/test_keypairs.rs
- sdk-libs/sdk/src/lib.rs
- sdk-tests/anchor-compressible/Cargo.toml
- sdk-tests/native-compressible/tests/test_config.rs
- sdk-tests/native-compressible/Cargo.toml
- sdk-libs/macros/src/discriminator.rs
- sdk-libs/sdk/src/compressible/mod.rs
- sdk-libs/sdk/src/compressible/compression_info.rs
- sdk-libs/sdk/src/compressible/compress_account.rs
- sdk-tests/native-compressible/src/create_config.rs
- sdk-tests/native-compressible/src/decompress_dynamic_pda.rs
- sdk-tests/anchor-compressible-derived/Cargo.toml
- sdk-tests/anchor-compressible-derived/tests/test_decompress_multiple.rs
🧰 Additional context used
🧬 Code Graph Analysis (1)
sdk-libs/macros/src/compressible.rs (13)
sdk-libs/macros/src/native_compressible.rs (2)
syn
(46-46)parse
(16-37)sdk-tests/native-compressible/src/create_dynamic_pda.rs (1)
std
(63-63)js/compressed-token/src/constants.ts (1)
POOL_SEED
(2-2)sdk-libs/macros/src/lib.rs (2)
add_compressible_instructions
(310-316)account
(344-350)sdk-libs/sdk/src/compressible/compression_info.rs (4)
compression_info
(8-8)compression_info_mut
(9-9)compression_info_mut_opt
(10-10)set_compression_info_none
(11-11)sdk-tests/native-compressible/src/lib.rs (7)
compression_info
(111-115)compression_info_mut
(117-121)compression_info_mut_opt
(123-125)set_compression_info_none
(127-129)size
(133-136)account
(162-162)account
(165-165)sdk-libs/sdk/src/account.rs (3)
size
(87-87)size
(309-314)account
(333-333)sdk-libs/program-test/src/program_test/compressible_setup.rs (2)
initialize_compression_config
(80-117)update_compression_config
(137-161)sdk-libs/sdk/src/compressible/config.rs (3)
process_initialize_compression_config_checked
(413-448)process_update_compression_config
(261-320)load_checked
(97-125)sdk-libs/sdk/src/compressible/decompress_idempotent.rs (1)
prepare_accounts_for_decompress_idempotent
(38-152)sdk-tests/native-compressible/src/decompress_dynamic_pda.rs (1)
prepare_accounts_for_decompress_idempotent
(148-148)sdk-libs/sdk/src/compressible/compress_account.rs (1)
compress_account
(38-90)sdk-libs/sdk/src/error.rs (4)
from
(90-92)from
(97-100)from
(104-131)from
(135-175)
🪛 GitHub Check: CodeQL
.github/workflows/sdk-tests.yml
[warning] 32-89: Workflow does not contain permissions
Actions job or workflow does not limit the permissions of the GITHUB_TOKEN. Consider setting an explicit permissions block, using the following as a minimal starting point: {{contents: read}}
🪛 markdownlint-cli2 (0.17.2)
sdk-libs/macros/src/EXAMPLE_USAGE.md
186-186: Fenced code blocks should have a language specified
(MD040, fenced-code-language)
196-196: Fenced code blocks should have a language specified
(MD040, fenced-code-language)
207-207: Fenced code blocks should have a language specified
(MD040, fenced-code-language)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (22)
- GitHub Check: examples-tests (sdk-anchor-test-program, ["cargo-test-sbf -p sdk-anchor-test", "cargo-test-sbf -p...
- GitHub Check: system-programs (random-e2e-test, ["cargo-test-sbf -p e2e-test"])
- GitHub Check: system-programs (system-cpi-test-v2-other, ["cargo-test-sbf -p system-cpi-v2-test -- --skip funct...
- GitHub Check: system-programs (system-cpi-test-v2-functional, ["cargo-test-sbf -p system-cpi-v2-test -- functio...
- GitHub Check: system-programs (light-registry, ["cargo-test-sbf -p registry-test"])
- GitHub Check: system-programs (account-compression, ["cargo-test-sbf -p account-compression-test"])
- GitHub Check: system-programs (system-cpi-test-v2-event, ["cargo-test-sbf -p system-cpi-v2-test -- event::parse"])
- GitHub Check: system-programs (light-system-program, ["cargo-test-sbf -p system-test"])
- GitHub Check: system-programs (system-cpi-test, ["cargo-test-sbf -p system-cpi-test"])
- GitHub Check: system-programs (light-compressed-token, ["cargo-test-sbf -p compressed-token-test"])
- GitHub Check: cli-v2
- GitHub Check: Test sdk-libs
- GitHub Check: Test batched-merkle-tree-simulate
- GitHub Check: Test program-libs-slow
- GitHub Check: Test concurrent-merkle-tree
- GitHub Check: Test program-libs-fast
- GitHub Check: Forester e2e test
- GitHub Check: stateless-js-v1
- GitHub Check: lint
- GitHub Check: cli-v1
- GitHub Check: stateless-js-v2
- GitHub Check: sdk-tests
🔇 Additional comments (45)
sdk-libs/sdk/src/compressible/decompress_idempotent.rs (1)
38-152
: LGTM!The idempotent decompression implementation is well-structured with proper validation, error handling, and addresses the previous size calculation concerns.
sdk-libs/macros/src/hasher/to_byte_array.rs (1)
7-90
: Well-structured hasher abstractionThe refactoring to support multiple hashers is clean and maintains backward compatibility. The SHA256 variant provides a simpler alternative by serializing the entire struct.
sdk-tests/anchor-compressible/tests/test_config.rs (1)
1-628
: Excellent test coverage for compression configThe test suite provides comprehensive coverage of configuration management including:
- Authority enforcement
- Re-initialization prevention
- Address space constraints
- Discriminator corruption detection
- Program data validation
The security-focused tests are particularly valuable for ensuring robustness.
sdk-tests/native-compressible/src/create_dynamic_pda.rs (1)
87-90
: Fix incorrect error mapping for invoke_signed.The error from
invoke_signed
is correctly mapped toLightSdkError::ProgramError(e)
now, which is appropriate. This issue has been resolved.sdk-libs/macros/src/hasher/light_hasher.rs (2)
55-76
: Well-implemented SHA256 hasher variant.The SHA256 implementation correctly:
- Uses SHA256-specific validation without field count limits
- Generates appropriate ToByteArray and DataHasher implementations
- Follows the same pattern as the Poseidon variant for consistency
This provides good flexibility for structs that exceed Poseidon's limitations.
441-712
: Excellent test coverage for SHA256 hasher.The test suite comprehensively covers:
- Large struct handling (>12 fields) that Poseidon cannot support
- Pubkey handling without requiring
#[hash]
attributes- Integration with SHA256 discriminator
- Real-world complex struct scenarios
- Clear comparison between SHA256 and Poseidon approaches
The tests effectively demonstrate when SHA256 is the appropriate choice and validate the implementation thoroughly.
sdk-libs/macros/CHANGELOG.md (1)
1-94
: Well-documented breaking changes with clear migration path.The changelog effectively communicates:
- The rationale for removing auto-generated create instructions
- New features added (config management, HasCompressionInfo trait)
- Comprehensive migration guide with practical code examples
- Clear before/after comparisons
This will help developers smoothly transition to the new patterns.
sdk-libs/macros/src/EXAMPLE_USAGE.md (1)
1-277
: Excellent documentation for native compressible instructions.This usage guide provides:
- Clear design philosophy explaining the manual dispatch approach
- Multiple implementation patterns (enum and constants-based)
- Comprehensive documentation of generated types and functions
- Practical client-side usage examples
- Well-structured benefits section
The flexibility offered by this approach allows developers to maintain full control over their instruction routing while benefiting from the macro's boilerplate generation.
sdk-tests/anchor-compressible-derived/src/lib.rs (4)
85-103
: LGTM! Proper compression info update handling.The manual update of the last written slot is correctly implemented with appropriate error handling.
159-196
: Well-structured multi-account compression implementation.The approach of preparing accounts separately and combining them for a single CPI call is efficient and follows best practices.
208-239
: Properly structured accounts with clear documentation.Good use of comments to explain space calculations and validation approach for unchecked accounts.
254-277
: Clean error handling and data structures.The error code and serialization structs are well-defined and follow Anchor conventions.
sdk-tests/native-compressible/tests/test_compressible_flow.rs (4)
27-29
: Well-defined test constants.Good practice defining RENT_RECIPIENT and COMPRESSION_DELAY as constants for test clarity.
110-201
: Well-structured helper function for account creation and compression.The function properly handles PDA derivation, validity proof retrieval, and instruction building for the native program.
203-265
: Correct implementation of decompression logic.The function properly retrieves the compressed account and uses the SDK's instruction builder for decompression.
325-391
: Comprehensive verification helpers with proper assertions.The verification functions thoroughly check account states, compression info, and data integrity at each stage of the compression lifecycle.
sdk-tests/native-compressible/src/lib.rs (3)
24-30
: Properly configured program constants.The program ID and CPI signer are correctly defined using Light SDK macros.
100-138
: Well-implemented compressible account structure.The
MyPdaAccount
correctly implements required traits with proper handling of the skipped compression_info field in size calculations.
139-284
: Comprehensive test coverage for SHA256 hashing.The tests thoroughly verify SHA256 functionality, including edge cases like large structs that exceed Poseidon's field limits.
sdk-libs/sdk/src/compressible/compress_account_on_init.rs (3)
26-68
: Clean wrapper implementation for single account compression.The function properly delegates to the multi-account helper and handles the CPI invocation.
92-171
: Well-structured multi-account compression with proper validation.The function correctly validates inputs, manages compression state, and handles account closing. The pattern of preparing multiple accounts for batch CPI is efficient.
343-370
: Correct account closing implementation.The function properly handles lamport transfer, system program assignment, and reallocation following Solana best practices.
sdk-tests/anchor-compressible/tests/test_decompress_multiple.rs (6)
27-28
: Well-defined test constants.Good practice using constants for ADDRESS_SPACE and RENT_RECIPIENT for test consistency.
30-142
: Comprehensive test for multi-account creation and decompression.The test thoroughly covers individual and combined account creation flows with proper assertions.
1020-1141
: Excellent security test for idempotent decompression.This test properly verifies that repeated decompression attempts don't corrupt account state, preventing potential attack vectors.
1143-1222
: Good edge case coverage for mixed state tree sources.The test validates that decompression correctly handles accounts from different state trees in a single operation.
1224-1324
: Thorough test of compression info updates.The test properly verifies that
last_written_slot
is correctly updated when account data changes.
204-411
: Well-structured test helper functions.The helper functions provide clean abstractions for complex operations like validity proof retrieval and tree info packing, making tests more readable and maintainable.
Also applies to: 612-1018
sdk-tests/anchor-compressible/src/lib.rs (12)
1-21
: LGTM: Well-structured imports and CPI signer setup.The imports are comprehensive and appropriate for a compressible account program. The
LIGHT_CPI_SIGNER
derivation follows the correct pattern for Light Protocol integration.
71-81
: Correct implementation with manual compression info handling.The manual setting of
last_written_slot
is correctly implemented. The comment explaining the necessity is helpful for maintainability.
84-125
: LGTM: Proper delegation to SDK compression config functions.Both config initialization and update handlers correctly delegate to the SDK functions with appropriate parameter mapping. The comments about macro derivation suggest good future automation potential.
235-291
: LGTM: Consistent game session creation following secure patterns.The implementation correctly mirrors the
create_record
pattern with appropriate game session-specific fields. UsingClock::get()
for timestamping is proper for tracking session start times.
294-381
: Sophisticated multi-account creation with proper atomic handling.This advanced instruction correctly:
- Creates multiple account types atomically
- Uses
prepare_accounts_for_compression_on_init
for batch processing- Combines compression info from both accounts
- Invokes CPI with all accounts in one call
The implementation demonstrates good understanding of the Light Protocol's multi-account compression capabilities.
388-416
: LGTM: Proper existing account compression with timing controls.The handler correctly uses
compress_account
(vscompress_account_on_init
) for existing accounts and includes thecompression_delay
parameter for timing control. Config validation follows the established pattern.
419-576
: Well-designed account contexts with proper constraints and space calculations.The context structs correctly:
- Include compression info in space calculations (e.g., line 429)
- Use appropriate seeds for PDA derivation
- Implement proper ownership and validation constraints
- Handle both init and mutation scenarios appropriately
582-646
: Excellent discriminated union design enabling generic multi-type handling.The
CompressedAccountVariant
enum and its trait implementations are well-designed:
- Enables type-safe handling of multiple account types in decompression
- Correctly implements all required traits with proper delegation
- The discriminator constant (line 604) is appropriately set as unused
This pattern allows the
decompress_accounts_idempotent
instruction to handle multiple account types generically.
648-653
: LGTM: Clean data structure for compressed account instruction parameters.The structure appropriately packages the metadata, account data, and seeds needed for decompression operations.
655-694
: Well-structured compressible account with proper trait implementations.The
UserRecord
struct correctly:
- Uses
#[skip]
for compression_info field (line 658)- Implements required traits for compression support
- Handles the Option properly in trait methods
- Calculates size including discriminator and init space
703-745
: LGTM: Consistent game session account following established compressible patterns.The
GameSession
struct appropriately mirrors theUserRecord
pattern with game-specific fields. The trait implementations are consistent and correct.
747-772
: LGTM: Appropriate error handling and clean parameter structures.The custom error codes address specific validation scenarios, and the helper structs provide clean parameter packaging for complex instructions.
sdk-libs/light-compressible-client/src/lib.rs (5)
1-40
: Excellent generic design with conditional compilation support.The conditional imports for Anchor/non-Anchor serialization (lines 1-4) and generic data structures enable broad reusability across different program types. The documentation is comprehensive and helpful.
`42-170`: LGTM: Well-structured instruction builders following Solana SDK patterns. The config instruction builders correctly:
- Use hardcoded discriminators with proper documentation
- Handle PDA derivation for config accounts (see the sketch after this list)
- Follow Solana SDK instruction building patterns
- Properly serialize instruction data with discriminators
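A client-side sketch of that config PDA derivation, assuming the `b"compressible_config"` seed shown in the account layout earlier in this review; the program ID here is a random placeholder:

```rust
use solana_program::pubkey::Pubkey;

fn main() {
    // Placeholder: a real client uses the deployed program's ID.
    let program_id = Pubkey::new_unique();

    // The config account is a PDA on the fixed seed, so clients and the
    // instruction builders always resolve the same address.
    let (config_pda, bump) =
        Pubkey::find_program_address(&[b"compressible_config"], &program_id);

    println!("config PDA: {config_pda} (bump {bump})");
}
```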
172-270
: Sophisticated compression instruction builder with proper tree info handling.The builder correctly handles the complex tree info packing and matching logic. The instruction construction properly includes all required accounts and system accounts.
The tree info matching logic (lines 221-234) is necessarily complex but well-implemented for finding the correct PackedStateTreeInfo.
287-389
: Comprehensive multi-account decompression builder with thorough validation.The builder correctly:
- Validates input length consistency (lines 318-323)
- Handles generic account variant types
- Properly constructs complex instructions with multiple accounts
- Includes all required system accounts and metadata
Despite the complexity, the implementation is robust and well-structured.
392-422
: LGTM: Clean instruction data structures with convenient re-export.The generic data structures are well-designed for reusability, and the re-export pattern (line 422) follows Solana SDK conventions for convenient usage.
let error_code: Item = syn::parse_quote! {
    #[error_code]
    pub enum ErrorCode {
        #[msg("Invalid account count: PDAs and compressed accounts must match")]
        InvalidAccountCount,
        #[msg("Rent recipient does not match config")]
        InvalidRentRecipient,
    }
};
Check for existing ErrorCode enum to avoid conflicts.
The macro unconditionally generates an ErrorCode
enum, which could conflict if the user has already defined one.
Consider checking if ErrorCode already exists:
- Parse existing items to check for ErrorCode
- If it exists, verify it has the required variants
- If not, add the missing variants instead of creating a new enum
Alternatively, namespace the error codes:
-#[error_code]
-pub enum ErrorCode {
+#[error_code]
+pub enum CompressibleErrorCode {
#[msg("Invalid account count: PDAs and compressed accounts must match")]
InvalidAccountCount,
#[msg("Rent recipient does not match config")]
InvalidRentRecipient,
}
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
let error_code: Item = syn::parse_quote! {
    #[error_code]
    pub enum CompressibleErrorCode {
        #[msg("Invalid account count: PDAs and compressed accounts must match")]
        InvalidAccountCount,
        #[msg("Rent recipient does not match config")]
        InvalidRentRecipient,
    }
};
🤖 Prompt for AI Agents
In sdk-libs/macros/src/compressible.rs around lines 1094 to 1102, the macro
always generates a new ErrorCode enum which can conflict with an existing
user-defined ErrorCode. To fix this, first parse the existing items to check if
an ErrorCode enum is already defined. If it exists, verify it contains the
required variants and add any missing ones instead of creating a new enum. If it
does not exist, then generate the full ErrorCode enum as before. Alternatively,
consider namespacing the error codes to avoid conflicts.
if H::ID != 0 {
    output.data_hash[0] = 0;
}
💡 Verification agent
🧩 Analysis chain
Document the purpose of hasher ID modification.
The code sets the first byte of data_hash to 0 when H::ID != 0
, but the purpose isn't documented.
Add a comment explaining why this modification is necessary:
output.data_hash = self.account.hash::<H>()?;
if H::ID != 0 {
+ // Set first byte to 0 to differentiate hasher types in the output
+ // This prevents hash collisions between different hasher implementations
output.data_hash[0] = 0;
}
Also verify this pattern is used consistently:
🏁 Script executed:
#!/bin/bash
# Check for other hasher ID usage patterns
ast-grep --pattern 'if $H::ID != 0 { $$$ }'
Length of output: 236
Add Documentation for Hasher ID Modification
The zeroing of the first byte when H::ID != 0
only occurs in sdk-libs/sdk/src/account.rs (lines 334–336). Please add a comment clarifying that this step embeds the hasher type into the output and prevents collisions between different hashers.
Affected location:
- File: sdk-libs/sdk/src/account.rs
Lines: 334–336
Proposed diff:
output.data_hash = self.account.hash::<H>()?;
if H::ID != 0 {
+ // Zero out the first byte to encode hasher type and avoid hash collisions
output.data_hash[0] = 0;
}
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
output.data_hash = self.account.hash::<H>()?;
if H::ID != 0 {
    // Zero out the first byte to encode hasher type and avoid hash collisions
    output.data_hash[0] = 0;
}
🤖 Prompt for AI Agents
In sdk-libs/sdk/src/account.rs at lines 334 to 336, add a comment above the code
that zeroes the first byte of output.data_hash when H::ID is not zero. The
comment should explain that this step embeds the hasher type into the output to
prevent collisions between different hashers, clarifying the purpose of this
modification for future readers.
LightSdkError::ConstraintViolation
})?;

msg!("Account closed successfully");
🛠️ Refactor suggestion
Remove success debug message.
This debug message should be removed from production code.
- msg!("Account closed successfully");
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
msg!("Account closed successfully"); |
🤖 Prompt for AI Agents
In sdk-libs/sdk/src/compressible/compress_account_on_init.rs at line 371, remove
the debug message "Account closed successfully" from the code to avoid logging
success messages in production.
/// Test that our instruction builders follow Solana SDK patterns correctly
/// They should return Instruction directly, not Result<Instruction, _>
Inconsistent documentation about return types.
The file header comment states that instruction builders "should return Instruction directly, not Result<Instruction, _>" (lines 9-10), but the decompress_accounts_idempotent
test expects and verifies a Result<Instruction, _>
return type (lines 191-192).
Please update the documentation to accurately reflect that some instruction builders may return Result
for operations that require validation.
Also applies to: 191-192
🤖 Prompt for AI Agents
In sdk-tests/anchor-compressible/tests/test_instruction_builders.rs around lines
9 to 10, the file header comment incorrectly states that instruction builders
should return Instruction directly and not Result<Instruction, _>. However, the
test at lines 191-192 expects a Result return type for some builders. Update the
header comment to clarify that some instruction builders may return
Result<Instruction, _> when validation is required, ensuring consistency with
the test expectations.
Actionable comments posted: 2
🔭 Outside diff range comments (1)
sdk-libs/macros/src/lib.rs (1)
`252-260`: Update documentation to reflect the SHA256 implementation. The documentation comment states this is an "Alias of `LightHasher`", but the implementation actually uses `derive_light_hasher_sha` (SHA256) instead of `derive_light_hasher` (Poseidon). This inconsistency could confuse users. Update the documentation to clarify that this uses SHA256:

-/// Alias of `LightHasher`.
+/// Alias of `LightHasherSha` - uses SHA256 for hashing.
♻️ Duplicate comments (5)
sdk-libs/macros/src/compressible.rs (2)
`95-100`: Clarify the purpose of the placeholder discriminator. The comment "This won't be used directly" could be more descriptive about why a zero discriminator is used here:

 impl light_sdk::LightDiscriminator for CompressedAccountVariant {
-    const LIGHT_DISCRIMINATOR: [u8; 8] = [0; 8]; // This won't be used directly
+    // The enum itself doesn't have a discriminator - each variant uses its struct's discriminator
+    const LIGHT_DISCRIMINATOR: [u8; 8] = [0; 8]; // Placeholder - actual discriminators come from variant structs
     const LIGHT_DISCRIMINATOR_SLICE: &'static [u8] = &Self::LIGHT_DISCRIMINATOR;
 }
`339-347`: Consider checking for an existing ErrorCode enum to avoid conflicts. The macro unconditionally generates an ErrorCode enum, which could conflict with user-defined enums. Consider either checking whether `ErrorCode` already exists and adding the missing variants, or using a namespaced enum name such as `CompressibleErrorCode`:

 #[error_code]
-pub enum ErrorCode {
+pub enum CompressibleErrorCode {
     #[msg("Invalid account count: PDAs and compressed accounts must match")]
     InvalidAccountCount,
     #[msg("Rent recipient does not match config")]
     InvalidRentRecipient,
 }

sdk-libs/sdk/src/account.rs (2)
`198-198`: Address the security review TODO for DEFAULT_DATA_HASH. The TODO comments for security review of DEFAULT_DATA_HASH usage are still present. Please complete the security analysis to ensure this sentinel value is safe.
The DEFAULT_DATA_HASH is used as a sentinel for empty account data. Please verify:
- No collision risks with actual account data hashes
- Safe to use in all compression/decompression contexts
- Document the security model
Once verified, update or remove the TODO comments.
Also applies to: 332-333
`336-338`: Document the hasher ID modification logic. The code modifies the first byte of data_hash when H::ID != 0, but the purpose isn't documented:

 output.data_hash = self.account.hash::<H>()?;
 if H::ID != 0 {
+    // Zero the first byte to differentiate hasher types and prevent collisions
     output.data_hash[0] = 0;
 }
sdk-libs/macros/src/lib.rs (1)
`6-6`: Add missing `ItemMod` import to fix compilation error. The `ItemMod` type is used without the `syn::` prefix at line 386, but it's not imported, which will cause a compilation error. Add `ItemMod` to the imports:

-use syn::{parse_macro_input, DeriveInput, ItemStruct};
+use syn::{parse_macro_input, DeriveInput, ItemMod, ItemStruct};

Also applies to: 386-386
🧹 Nitpick comments (3)
sdk-libs/macros/src/EXAMPLE_USAGE.md (2)
`111-111`: Consider simplifying the wording. The phrase "exact same" could be simplified to just "same" for conciseness:

-- **`compress_{type_name}`** - Compresses the PDA using the exact same seeds
+- **`compress_{type_name}`** - Compresses the PDA using the same seeds

`165-165`: Consider reducing exclamation marks for a more professional tone. Multiple exclamation marks might appear excessive in technical documentation:

-This approach completely solves the external file module limitation while being more robust and user-friendly than any pattern matching could be!
+This approach completely solves the external file module limitation while being more robust and user-friendly than any pattern matching could be.

sdk-libs/macros/src/compressible.rs (1)

`463-471`: Remove unused placeholder function. The `find_seeds_from_registry` function is a placeholder that always returns `None` and appears to be unused; the actual implementation is in `find_seeds_from_registry_in_module`. Consider removing this unused function to reduce confusion:

-/// Find seeds from registry functions generated by derive(Compressible)
-fn find_seeds_from_registry(account_type: &Ident) -> Result<Option<SeedInfo>> {
-    // For now, return a placeholder - we'll implement the actual registry lookup later
-    // The registry approach needs access to the module content to scan for generated modules
-
-    // Return None for now - this will trigger the error message
-    // We need to pass the module content to this function to make it work
-    Ok(None)
-}
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (8)
sdk-libs/macros/CHANGELOG.md
(1 hunks)sdk-libs/macros/src/EXAMPLE_USAGE.md
(1 hunks)sdk-libs/macros/src/compressible.rs
(1 hunks)sdk-libs/macros/src/compressible_derive.rs
(1 hunks)sdk-libs/macros/src/lib.rs
(4 hunks)sdk-libs/sdk-types/src/constants.rs
(1 hunks)sdk-libs/sdk/src/account.rs
(9 hunks)sdk-tests/native-compressible/src/lib.rs
(1 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
- sdk-libs/sdk-types/src/constants.rs
- sdk-tests/native-compressible/src/lib.rs
🧰 Additional context used
🪛 LanguageTool
sdk-libs/macros/src/EXAMPLE_USAGE.md
[style] ~111-~111: ‘exact same’ might be wordy. Consider a shorter alternative.
Context: ...name}** - Compresses the PDA using the exact same seeds - **decompress_accounts_idempote...
(EN_WORDINESS_PREMIUM_EXACT_SAME)
[style] ~165-~165: Using many exclamation marks might seem excessive (in this case: 4 exclamation marks for a text that’s 1957 characters long)
Context: ...endly than any pattern matching could be!
(EN_EXCESSIVE_EXCLAMATION)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (22)
- GitHub Check: system-programs (random-e2e-test, ["cargo-test-sbf -p e2e-test"])
- GitHub Check: lint
- GitHub Check: system-programs (system-cpi-test-v2-functional, ["cargo-test-sbf -p system-cpi-v2-test -- functio...
- GitHub Check: system-programs (system-cpi-test, ["cargo-test-sbf -p system-cpi-test"])
- GitHub Check: system-programs (system-cpi-test-v2-other, ["cargo-test-sbf -p system-cpi-v2-test -- --skip funct...
- GitHub Check: system-programs (system-cpi-test-v2-event, ["cargo-test-sbf -p system-cpi-v2-test -- event::parse"])
- GitHub Check: system-programs (light-compressed-token, ["cargo-test-sbf -p compressed-token-test"])
- GitHub Check: system-programs (light-registry, ["cargo-test-sbf -p registry-test"])
- GitHub Check: system-programs (account-compression, ["cargo-test-sbf -p account-compression-test"])
- GitHub Check: system-programs (light-system-program, ["cargo-test-sbf -p system-test"])
- GitHub Check: stateless-js-v2
- GitHub Check: cli-v1
- GitHub Check: Test sdk-libs
- GitHub Check: cli-v2
- GitHub Check: Test program-libs-slow
- GitHub Check: Test program-libs-fast
- GitHub Check: Test batched-merkle-tree-simulate
- GitHub Check: Test concurrent-merkle-tree
- GitHub Check: examples-tests (sdk-anchor-test-program, ["cargo-test-sbf -p sdk-anchor-test", "cargo-test-sbf -p...
- GitHub Check: Forester e2e test
- GitHub Check: stateless-js-v1
- GitHub Check: sdk-tests
🔇 Additional comments (2)
sdk-libs/macros/src/compressible_derive.rs (1)
1-241: Well-designed derive macro for automatic seed extraction.

The implementation correctly extracts seed information from Anchor account attributes and generates registry modules that can be consumed by the main `add_compressible_instructions` macro. The error handling is comprehensive and the generated code follows good patterns.
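For readers skimming the review, a rough sketch of the registry-module idea being praised here. The module name, constant names, and the `UserRecord` example are hypothetical, not the macro's verbatim output:

```rust
// Hypothetical output of `#[derive(Compressible)]` on a `UserRecord` account
// declared with `seeds = [b"user_record", owner.key().as_ref()]`. The
// `add_compressible_instructions` macro could look this module up by type
// name at expansion time to recover the PDA seed layout.
pub mod user_record_seed_registry {
    /// Literal seed prefixes extracted from the `#[account(...)]` attribute.
    pub const STATIC_SEEDS: &[&[u8]] = &[b"user_record"];

    /// Account fields whose keys supply the remaining, dynamic seeds.
    pub const DYNAMIC_SEED_FIELDS: &[&str] = &["owner"];
}
```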
sdk-libs/macros/CHANGELOG.md (1)

1-107: Comprehensive changelog with clear migration guide.

The changelog effectively documents the breaking changes, new features, and provides detailed migration instructions with code examples. This will greatly help users upgrade to the new version.
```rust
seeds_refs.push(&bump_slice);

// Create LightAccount with correct discriminator
let light_account = light_sdk::account::sha::LightAccount::<'_, #struct_names>::new_mut(
```
Fix type parameter in LightAccount creation.

The code uses `#struct_names` as a type parameter, but this expands to all struct names. Since we're inside a match arm for a specific variant, we should use that specific type.

```diff
-        let light_account = light_sdk::account::sha::LightAccount::<'_, #struct_names>::new_mut(
+        let light_account = light_sdk::account::sha::LightAccount::new_mut(
             &crate::ID,
             &compressed_data.meta,
             data,
         )?;

         // Process this single account
-        let compressed_infos = light_sdk::compressible::prepare_accounts_for_decompress_idempotent::<#struct_names>(
+        let compressed_infos = light_sdk::compressible::prepare_accounts_for_decompress_idempotent(
```

The compiler should be able to infer the correct type from the `data` parameter.

Also applies to: 312-312
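The inference claim is ordinary Rust generics at work; a tiny self-contained sketch with made-up types (not the light_sdk API) shows the same effect of dropping the turbofish:

```rust
struct LightAccountLike<T> {
    data: T,
}

impl<T> LightAccountLike<T> {
    fn new_mut(data: T) -> Self {
        LightAccountLike { data }
    }
}

fn main() {
    let data = 42u64;
    // No `::<u64>` turbofish needed: the generic parameter is pinned by the
    // type of `data`, just as the account type would be in the suggested diff.
    let account = LightAccountLike::new_mut(data);
    println!("{}", account.data);
}
```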
🤖 Prompt for AI Agents
In sdk-libs/macros/src/compressible.rs at lines 305 and 312, replace the generic
type parameter `#struct_names` used in the `LightAccount::new_mut` call with the
specific type corresponding to the current match arm variant. Remove the
explicit type parameter to allow the compiler to infer the correct type from the
`data` parameter, ensuring the type matches the variant being handled.
```rust
#[proc_macro_attribute]
pub fn account(_: TokenStream, input: TokenStream) -> TokenStream {
    let input = parse_macro_input!(input as ItemStruct);

    account::account(input)
        .unwrap_or_else(|err| err.to_compile_error())
        .into()
}
```
🛠️ Refactor suggestion

Add documentation for the `account` attribute macro

This macro lacks documentation. Users need to understand its purpose and usage.

Add documentation similar to other macros in this file:

````diff
+/// Attribute macro for defining light protocol accounts.
+///
+/// ## Usage
+/// ```ignore
+/// #[account]
+/// pub struct MyAccount {
+///     // Account fields
+/// }
+/// ```
 #[proc_macro_attribute]
 pub fn account(_: TokenStream, input: TokenStream) -> TokenStream {
````
🤖 Prompt for AI Agents
In sdk-libs/macros/src/lib.rs around lines 344 to 351, the `account` attribute
macro lacks documentation. Add a doc comment above the macro definition that
explains its purpose and usage, following the style of other macros in the file.
Include an example usage snippet showing how to apply the `#[account]` attribute
to a struct, to help users understand how to use it.
Summary by CodeRabbit

- New Features
- Bug Fixes
- Tests
- Chores
- Documentation