DeepGate: Learning Neural Representations of Logic Gates


Applying deep learning (DL) techniques to electronic design automation (EDA) has become a trending topic. Most existing solutions apply well-developed DL models to specific EDA problems. While they demonstrate promising results, these solutions require careful model tuning for every problem. The fundamental question of how to obtain a general and effective neural representation of circuits, however, remains unanswered.

In this work, we take the first step towards answering this question. We propose DeepGate, a novel representation learning solution that embeds both the logic function and the structural information of a circuit as vectors on each gate. Specifically, we transform circuits into a unified and-inverter graph (AIG) format for learning and use signal probabilities as the supervision task in DeepGate. We then introduce a novel graph neural network that exploits strong inductive biases in practical circuits as learning priors for signal probability prediction. Our experimental results demonstrate the efficacy and generalization capability of DeepGate.
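As context for the supervision task mentioned above, the sketch below shows one way per-gate signal probabilities (the probability that a gate outputs 1 under random inputs) could be estimated for a toy AIG by random logic simulation. This is a minimal illustration, not the paper's implementation: the gate encoding, the toy circuit, and the helper names `simulate_once` and `signal_probabilities` are assumptions chosen for clarity.

```python
import random

# Hypothetical AIG encoding (illustrative, not from the paper):
# each node is ("PI",) for a primary input, ("AND", a, b), or ("NOT", a),
# where a and b index earlier nodes, so the list is in topological order.
AIG = [
    ("PI",),        # 0: primary input
    ("PI",),        # 1: primary input
    ("PI",),        # 2: primary input
    ("AND", 0, 1),  # 3: AND of nodes 0 and 1
    ("NOT", 3),     # 4: inverter on node 3
    ("AND", 4, 2),  # 5: AND of nodes 4 and 2
]

def simulate_once(aig):
    """Evaluate every node for one random primary-input assignment."""
    values = []
    for node in aig:
        if node[0] == "PI":
            values.append(random.randint(0, 1))
        elif node[0] == "AND":
            values.append(values[node[1]] & values[node[2]])
        else:  # "NOT"
            values.append(1 - values[node[1]])
    return values

def signal_probabilities(aig, num_patterns=10000):
    """Estimate Pr[node output = 1] for every node via random simulation."""
    ones = [0] * len(aig)
    for _ in range(num_patterns):
        for i, v in enumerate(simulate_once(aig)):
            ones[i] += v
    return [count / num_patterns for count in ones]

if __name__ == "__main__":
    for i, p in enumerate(signal_probabilities(AIG)):
        print(f"node {i}: estimated signal probability {p:.3f}")
```

In a setup like this, the estimated probabilities would serve as per-gate regression targets, while the AIG's node types and edges would form the graph fed to the neural network.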


1. Overview of DeepGate.


2. Performance comparison of DeepGate with other GNN models for logic probability prediction.


3. Performance comparison of DeepGate and DeepSet on five large circuits.


@inproceedings{li2022deepgate,
  title={{DeepGate}: Learning neural representations of logic gates},
  author={Li, Min and Khan, Sadaf and Shi, Zhengyuan and Wang, Naixing and Yu, Huang and Xu, Qiang},
  booktitle={Proceedings of the 59th ACM/IEEE Design Automation Conference},
  pages={667--672},
  year={2022}
}