- Create Date January 23, 2026
- Last Updated January 23, 2026
CROSS-PARTITION ATTENTION (CPA): A PROPOSED SCALABLE TRANSFORMER FRAMEWORK FOR SMART CONTRACT VULNERABILITY DETECTION
ABSTRACT
Blockchain’s decentralized trust model relies heavily on secure smart contracts. However, vulnerabilities such as reentrancy and integer overflow persist due to the limitations of existing detection tools. While transformer-based models (e.g., CodeBERT) improve on traditional rule-based methods, they struggle with context truncation and token-level bias, often failing to capture cross-function vulnerabilities in lengthy contracts because of their 512-token window limits. We propose Cross-Partition Attention (CPA), a scalable framework designed to overcome these limitations through: (1) window-level partitioning to preserve the full contract context without truncation, (2) Line-of-Code (LoC) embeddings that combine CodeBERT semantics with positional encoding to model execution-order risks, and (3) security-centric contrastive learning to distinguish vulnerable from safe patterns. We anticipate that CPA will detect vulnerabilities more accurately than state-of-the-art tools such as Slither and MythX, and offer more efficient inference than standard CodeBERT transformers. Additionally, CPA will generate interpretable attention maps to aid vulnerability localization and detection. CPA promises to bridge the gap between scalability and precision in smart contract auditing, offering practical applications for Blockchain 3.0 platforms such as Ethereum.
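The window-level partitioning idea in (1) can be illustrated with a short sketch: split a tokenized contract into overlapping fixed-size windows so that no part of a long contract is truncated, and cross-function context surviving in the overlap can later be related by cross-partition attention. The function name, window size, and stride below are illustrative assumptions, not details from the paper; the 512-token window simply mirrors CodeBERT's context limit mentioned above.

```python
def partition_tokens(tokens, window=512, stride=384):
    """Split a token sequence into overlapping windows.

    Hypothetical helper: `window` mirrors CodeBERT's 512-token limit;
    `stride < window` leaves a 128-token overlap so patterns spanning
    a window boundary appear whole in at least one partition.
    """
    if len(tokens) <= window:
        return [list(tokens)]
    parts = []
    start = 0
    while start < len(tokens):
        parts.append(list(tokens[start:start + window]))
        if start + window >= len(tokens):
            break  # final window reaches the end of the contract
        start += stride
    return parts


# Example: a 1000-token contract yields three overlapping windows,
# covering every token without truncation.
windows = partition_tokens(list(range(1000)))
```

Each window would then be embedded independently (with LoC positional information attached per line) before cross-partition attention aggregates them, which is what keeps per-window cost bounded while retaining whole-contract context.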
