Power map


DeepOHeat: Operator Learning-based Ultra-fast Thermal Simulation in 3D-IC Design

Liu, Ziyue, Li, Yixing, Hu, Jing, Yu, Xinling, Shiau, Shinyu, Ai, Xin, Zeng, Zhiyu, Zhang, Zheng

arXiv.org Artificial Intelligence

Thermal issues are a major concern in 3D integrated circuit (IC) design. Thermal optimization of 3D ICs often requires massive numbers of expensive PDE simulations. Neural network-based thermal prediction models can perform real-time prediction for many unseen new designs. However, existing works either solve for 2D temperature fields only or do not generalize well to new designs with unseen design configurations (e.g., heat sources and boundary conditions). In this paper, for the first time, we propose DeepOHeat, a physics-aware operator learning framework to predict the temperature field of a family of heat equations with multiple parametric or non-parametric design configurations. This framework learns a functional map from the function space of multiple key PDE configurations (e.g., boundary conditions, power maps, heat transfer coefficients) to the function space of the corresponding solution (i.e., temperature fields), enabling fast thermal analysis and optimization by changing key design configurations (rather than just some parameters). We test DeepOHeat on several industrial design cases and compare it against Celsius 3D from Cadence Design Systems. Our results show that, for unseen testing cases, a well-trained DeepOHeat produces accurate results with a $1000\times$ to $300000\times$ speedup.
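The operator-learning idea in the abstract above, mapping a configuration function (such as a power map) plus a query coordinate to a temperature value, can be sketched as a minimal DeepONet-style forward pass. The network sizes, sensor count, and helper names below are illustrative assumptions, not DeepOHeat's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(sizes):
    """Random weights for a small fully connected network."""
    return [(rng.normal(scale=0.1, size=(m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp_forward(x, weights):
    """Forward pass with tanh hidden activations and a linear output layer."""
    for W, b in weights[:-1]:
        x = np.tanh(x @ W + b)
    W, b = weights[-1]
    return x @ W + b

# Branch net encodes the design configuration (here: a power map sampled at
# 64 fixed sensor locations); trunk net encodes a 3D query coordinate.
n_sensors, latent = 64, 32
branch = init_mlp([n_sensors, 64, latent])
trunk = init_mlp([3, 64, latent])

def deeponet_temperature(power_samples, xyz):
    """Predicted temperature at point xyz for a given power-map function."""
    b = mlp_forward(power_samples, branch)  # latent code of the configuration
    t = mlp_forward(xyz, trunk)             # latent code of the query point
    return float(b @ t)                     # inner product -> scalar field value

power_map = rng.random(n_sensors)           # one (untrained) design configuration
T = deeponet_temperature(power_map, np.array([0.5, 0.5, 0.1]))
```

Because the branch input is a function sample rather than a fixed parameter vector, the same trained network can be queried with entirely new power maps, which is what distinguishes operator learning from per-design surrogate models.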


A Thermal Machine Learning Solver For Chip Simulation

Ranade, Rishikesh, He, Haiyang, Pathak, Jay, Chang, Norman, Kumar, Akhilesh, Wen, Jimin

arXiv.org Artificial Intelligence

Excessive peak temperatures and steep thermal gradients can fatally impact transistor performance, stress, aging, electromigration (EM), voltage drops, and timing [18, 10]. Hence, accurate prediction of the maximum temperature and thermal gradient on the chip becomes important for the performance and reliability of chip-packaging systems used in applications such as 5G, automobiles, and computational hardware for Artificial Intelligence. Conventional Finite Element Analysis (FEA) or Computational Fluid Dynamics (CFD) based thermal analysis is computationally expensive due to the enormous system parameter space spanned by steep power maps and a wide range of Heat Transfer Coefficients (HTCs), die thicknesses, and chip sizes. As a result, batches of simulations must be solved from scratch every time new system parameters of electronic chips are considered. Recently, a multitude of machine learning methods have been proposed to enhance and accelerate physics-based numerical solvers in the context of electronic chip simulations.


Thermal and IR Drop Analysis Using Convolutional Encoder-Decoder Networks

Chhabria, Vidya A., Ahuja, Vipul, Prabhu, Ashwath, Patil, Nikhil, Jain, Palkesh, Sapatnekar, Sachin S.

arXiv.org Artificial Intelligence

Computationally expensive temperature and power grid analyses are required during the design cycle to guide IC design. This paper employs encoder-decoder based generative (EDGe) networks to map these analyses to fast and accurate image-to-image and sequence-to-sequence translation tasks. The network takes a power map as input and outputs the corresponding temperature or IR drop map. We propose two networks: (i) ThermEDGe, a static and dynamic full-chip temperature estimator, and (ii) IREDGe, a full-chip static IR drop predictor based on input power, power grid distribution, and power pad distribution patterns. The models are design-independent and need to be trained only once for a particular technology and packaging solution. ThermEDGe and IREDGe rapidly predict the on-chip temperature and IR drop contours in milliseconds (in contrast with commercial tools that require several hours or more) and provide average errors of 0.6% and 0.008%, respectively.
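The image-to-image formulation above can be sketched as a toy encoder-decoder round trip in NumPy: downsample the power map "image" to a coarse latent image, apply a learned transformation, and upsample back to full resolution. The pooling/upsampling scheme and the single scalar weight `W` are simplifying assumptions for illustration, not ThermEDGe's actual convolutional architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def avg_pool2(img):
    """2x2 average pooling (one encoder downsampling step)."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample2(img):
    """Nearest-neighbour 2x upsampling (one decoder step)."""
    return img.repeat(2, axis=0).repeat(2, axis=1)

def edge_forward(power_map, W):
    """Encode the power map to a coarse latent image, apply a toy learned
    pointwise nonlinearity, then decode back to a full-resolution map."""
    z = avg_pool2(avg_pool2(power_map))   # 32x32 -> 8x8 latent image
    z = np.tanh(W * z)                    # stand-in for the learned layers
    return upsample2(upsample2(z))        # 8x8 -> 32x32 "temperature" map

power = rng.random((32, 32))              # input power map as an image
temp = edge_forward(power, W=rng.normal())
```

The key property being illustrated is that input and output are same-sized grids, so once trained for a technology and package, the same network applies to any design's power map without retraining.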


An unsupervised learning approach to solving heat equations on chip based on Auto Encoder and Image Gradient

He, Haiyang, Pathak, Jay

arXiv.org Machine Learning

Solving heat transfer equations on chip is becoming critical for upcoming 5G and AI chip-package systems. However, batches of simulations have to be performed to train data-driven supervised machine learning models. Because such data-driven methods are data hungry, Physics-Informed Neural Networks (PINNs) have been proposed to address this. However, vanilla PINN models solve one fixed heat equation at a time, so the models have to be retrained for heat equations with different source terms. Additionally, issues related to multi-objective optimization have to be resolved when using a PINN to simultaneously minimize the PDE residual, satisfy boundary conditions, and fit the observed data. Therefore, this paper investigates an unsupervised learning approach for solving heat transfer equations on chip without using solution data, generalizing the trained network to predict solutions for heat equations with unseen source terms. Specifically, a hybrid framework of an Auto Encoder (AE) and an Image Gradient (IG) based network is designed. The AE encodes the different source terms of the heat equations. The IG-based network implements a second-order central difference scheme for structured grids and minimizes the PDE residual. The effectiveness of the designed network is evaluated by solving heat equations for various use cases. It is shown that, with a limited number of source terms used to train the AE network, the framework can not only solve the given heat transfer problems with a single training process but also make reasonable predictions for unseen cases (heat equations with new source terms) without retraining.
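The second-order central-difference PDE residual that the IG-based network minimizes can be written directly in NumPy. The function name and the steady-state, uniform-conductivity form of the heat equation are assumptions for illustration; the paper's network evaluates an analogous residual as its unsupervised training loss:

```python
import numpy as np

def heat_residual(T, q, k=1.0, h=1.0):
    """Interior residual of the steady heat equation k * Laplacian(T) + q = 0
    on a uniform structured grid with spacing h, using second-order central
    differences. Returns the residual on interior grid points only."""
    lap = (T[2:, 1:-1] + T[:-2, 1:-1] + T[1:-1, 2:] + T[1:-1, :-2]
           - 4.0 * T[1:-1, 1:-1]) / h**2
    return k * lap + q[1:-1, 1:-1]

# Sanity check: T(x, y) = x^2 + y^2 has Laplacian exactly 4 under central
# differences (the scheme is exact for quadratics), so with source q = -4
# the residual vanishes up to floating-point error.
n = 33
x = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")
T = X**2 + Y**2
q = -4.0 * np.ones_like(T)
r = heat_residual(T, q, k=1.0, h=x[1] - x[0])
loss = float(np.mean(r**2))  # the scalar an unsupervised network would minimize
```

Training then amounts to adjusting network weights so that the predicted temperature field drives this mean-squared residual (plus boundary terms) toward zero, with no reference solutions required.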


CES Showcases Gadgetry, AI and How Tech's Power Map is Changing -- Red Herring

#artificialintelligence

As if 2017 wasn't the year that confirmed anything could be hooked up to the net, CES – traditionally the season opener for the year in tech – went into IoT overload with a bunch of gadgets that pushed the connected envelope. The show, which drew 3,900 exhibitors across 2.75 million square feet (a record, by the way), could hardly have been packed with more exciting consumer products showing how much smaller our world is getting. Among them, Lenovo wowed with a new Smart Display device, Sony's new Aibo robotic dog revisited its breakneck nineties heyday, and Israeli firm Lishtot burst into the public eye with its plectrum-shaped water tester. As expected, transport garnered a significant number of headlines. Just about every major automaker signaled its intention to get driverless cars onto the streets as soon as possible (but not yet), while tech companies like Aurora and Voyage impressed with high-tech bells and whistles that will speed up the process.


Chatbot Concept for Otto

#artificialintelligence
