Presentation
FuncFormer: Circuit Representation Learning via the Flow of Functional Propagation
Description
Learning expressive and generalizable representations from raw circuit graphs is crucial for advancing various tasks in electronic design automation (EDA). However, circuit representation learning is extremely challenging due to the intricate topological structure and rich functional semantics of circuits. To address this challenge, we propose a novel graph transformer architecture, called FuncFormer, which integrates the flow of functional propagation into the representation space to enable expressive and generalizable circuit representations. The key insight behind FuncFormer is that capturing the flow of functional propagation is fundamental to circuit representation learning, as it inherently encapsulates both the circuit structure and functionality. Specifically, FuncFormer initializes input node features using functional input signals and propagates these features along the directions of signal flow based on the Boolean functionality of each logic gate (i.e., node). Subsequently, FuncFormer employs a self-attention module to produce final representations by attentively embedding the generated functional features. To demonstrate the effectiveness of FuncFormer, we evaluate it on three typical EDA tasks: Quality of Result (QoR) prediction, functional reasoning, and formal property verification. The experiments show that (1) FuncFormer reduces the estimation error by 48.49% compared to the state-of-the-art (SOTA) methods for QoR prediction after logic synthesis; (2) FuncFormer improves the reasoning accuracy by 12.07% over the SOTA approach for identifying logically equivalent gates; and (3) FuncFormer increases the number of solved properties by factors of 4.77 and 2.80 over manually designed heuristics and typical GNNs, respectively, on large-scale sequential circuits.
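The propagation step described above can be illustrated with a small, hedged sketch (this is not the authors' code, and the gate set, signature width, and function names below are illustrative assumptions): each primary input is assigned a random bit-vector "functional signal", and each gate combines its fan-in signals with its own Boolean function in topological order, so a node's signature reflects both its structure and its logic function. In particular, logically equivalent gates end up with identical signatures.

```python
# Hedged sketch of "flow of functional propagation" via bit-parallel
# Boolean simulation on a gate-level netlist (illustrative, not FuncFormer).
import random
from graphlib import TopologicalSorter

WIDTH = 64                    # bits per functional signature
MASK = (1 << WIDTH) - 1       # keep NOT results within WIDTH bits

def propagate(gates, primary_inputs, seed=0):
    """gates: {name: (op, fanin_names)} with op in {'AND', 'OR', 'NOT'}.
    Returns a dict mapping every node to its bit-vector signature."""
    rng = random.Random(seed)
    # Initialize primary inputs with random functional signals.
    sig = {pi: rng.getrandbits(WIDTH) for pi in primary_inputs}
    # Visit gates in topological order (fan-ins before fan-outs).
    order = TopologicalSorter({g: set(f) for g, (_, f) in gates.items()}).static_order()
    for g in order:
        if g in sig:          # primary input, already assigned
            continue
        op, fanin = gates[g]
        vals = [sig[f] for f in fanin]
        if op == 'AND':
            v = vals[0]
            for x in vals[1:]:
                v &= x
        elif op == 'OR':
            v = vals[0]
            for x in vals[1:]:
                v |= x
        elif op == 'NOT':
            v = ~vals[0] & MASK
        else:
            raise ValueError(f"unknown gate type: {op}")
        sig[g] = v
    return sig

# Usage: by De Morgan's law, NOT(a AND b) and (NOT a) OR (NOT b) are
# logically equivalent, so their propagated signatures coincide.
gates = {
    'na': ('NOT', ['a']),
    'nb': ('NOT', ['b']),
    'ab': ('AND', ['a', 'b']),
    'g1': ('NOT', ['ab']),
    'g2': ('OR',  ['na', 'nb']),
}
sig = propagate(gates, ['a', 'b'])
print(sig['g1'] == sig['g2'])   # equivalent gates share a signature
```

In the paper's architecture these propagated features are then fed to a self-attention module; the sketch stops at the feature-generation stage, which is the part the abstract specifies concretely.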
Event Type
Networking
Work-in-Progress Poster
Time
Monday, June 23, 6:00pm - 7:00pm PDT
Location
Level 2 Lobby