RCNet: $ΔΣ$ IADCs as Recurrent AutoEncoders
Verdant, Arnaud, Guicquero, William, Chossat, Jérôme
arXiv.org Artificial Intelligence
This paper proposes a deep learning model (RCNet) for Delta-Sigma ($ΔΣ$) ADCs. Recurrent Neural Networks (RNNs) make it possible to describe both modulators and filters; this analogy is applied to Incremental ADCs (IADCs). High-end optimizers combined with full-custom losses are used to impose additional hardware design constraints: quantized weights, signal saturation, temporal noise injection, and device area. Focusing on DC conversion, our early results demonstrate that the $SNR$, expressed as an Effective Number Of Bits (ENOB), can be optimized under a given hardware mapping complexity. The proposed RCNet succeeds in providing design tradeoffs in terms of $SNR$ ($>$13 bit) versus area constraints ($<$14 pF total capacitance) at a given $OSR$ (80 samples). Interestingly, the best RCNet architectures do not necessarily rely on high-order modulators, leveraging additional degrees of freedom in topology exploration.
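The modulator-as-RNN analogy in the abstract can be illustrated with a toy example: a first-order incremental ΔΣ modulator is a recurrent state update (an integrator fed by the input minus the quantized feedback), and a simple averaging decimation filter recovers a DC input from the bitstream. The sketch below is not the paper's RCNet model; the function names, the bipolar ±1 quantizer levels, and the averaging decoder are illustrative assumptions, and the ENOB formula is the standard $ENOB = (SNR_{dB} - 1.76)/6.02$ convention.

```python
def first_order_iadc(x, osr=80):
    """Toy first-order incremental Delta-Sigma modulator (NOT the paper's RCNet).

    Recurrent state update, one step per sample:
        v[n] = v[n-1] + x - b[n-1]        # integrator (the RNN hidden state)
        b[n] = +1 if v[n] >= 0 else -1    # 1-bit quantizer (the nonlinearity)
    Assumes a DC input x in (-1, 1) and bipolar feedback levels +/-1.
    """
    v = 0.0       # integrator state, reset at the start of each conversion
    b = 0.0       # previous feedback bit (none before the first sample)
    bits = []
    for _ in range(osr):
        v += x - b
        b = 1.0 if v >= 0.0 else -1.0
        bits.append(b)
    # Averaging decimation filter: for a first-order incremental converter,
    # the decode error is bounded by roughly (|v_final| + 1) / osr.
    x_hat = sum(bits) / osr
    return x_hat, bits


if __name__ == "__main__":
    x_hat, bits = first_order_iadc(0.3, osr=80)
    print(f"input=0.3  decoded={x_hat:.4f}  error={abs(x_hat - 0.3):.4f}")
```

With OSR = 80 this first-order loop only reaches a few effective bits on its own, which is consistent with the abstract's point that the interesting tradeoffs (>13 bit ENOB under a capacitance budget) come from optimizing the whole recurrent topology rather than simply raising the modulator order.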
Jun-23-2025