Smooth Variational Graph Embeddings for Efficient Neural Architecture Search

Jovita Lukasik, David Friede, Arber Zela, Heiner Stuckenschmidt, Frank Hutter, Margret Keuper

arXiv.org Artificial Intelligence 

This leads to the desire for an accurate search space encoding that enables performance prediction via surrogates and black-box optimization to find high-performing architectures in a continuous search space [67]. Zhang et al. [67] propose D-VAE, a graph neural network (GNN) [14, 23, 56] based variational neural architecture embedding with an emphasis on the information flow, and thereby achieve good results in architecture performance prediction and Bayesian optimization (BO) on the ENAS search space [39] and on a dataset of Bayesian networks.

In this paper, we propose an approach to neural architecture search (NAS) based on graph embeddings. NAS has previously been addressed using discrete, sampling-based methods, which are computationally expensive, as well as differentiable approaches, which come at lower cost but enforce stronger constraints on the search space. The proposed approach leverages advantages from both sides by building a smooth variational neural architecture embedding.
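To make the described pipeline concrete, below is a minimal sketch, assuming PyTorch, of how a GNN-based variational encoder for architecture graphs and a latent-space performance surrogate could fit together. All names here (GraphVAEEncoder, PerformancePredictor, the message-passing scheme) are illustrative assumptions, not the paper's actual implementation; the graph decoder and training losses are omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphVAEEncoder(nn.Module):
    """Hypothetical sketch: message passing over the adjacency matrix of an
    architecture DAG, mean-pooled into a Gaussian latent code."""
    def __init__(self, node_feat_dim, hidden_dim, latent_dim, num_rounds=2):
        super().__init__()
        self.embed = nn.Linear(node_feat_dim, hidden_dim)
        self.msg = nn.ModuleList(
            [nn.Linear(hidden_dim, hidden_dim) for _ in range(num_rounds)]
        )
        self.mu = nn.Linear(hidden_dim, latent_dim)
        self.logvar = nn.Linear(hidden_dim, latent_dim)

    def forward(self, adj, x):
        # adj: (n, n) adjacency matrix; x: (n, node_feat_dim) one-hot operation types
        h = F.relu(self.embed(x))
        for lin in self.msg:
            # aggregate neighbour messages along edges, with a residual update
            h = F.relu(lin(adj @ h) + h)
        g = h.mean(dim=0)  # graph-level read-out
        return self.mu(g), self.logvar(g)

def reparameterize(mu, logvar):
    """Standard VAE reparameterisation trick, giving a smooth latent space."""
    std = torch.exp(0.5 * logvar)
    return mu + std * torch.randn_like(std)

class PerformancePredictor(nn.Module):
    """Surrogate head regressing (e.g.) validation accuracy from the latent code."""
    def __init__(self, latent_dim, hidden_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, 1)
        )

    def forward(self, z):
        return self.net(z).squeeze(-1)

# Toy usage: a 4-node cell with 3 operation types (values are illustrative).
enc = GraphVAEEncoder(node_feat_dim=3, hidden_dim=32, latent_dim=16)
head = PerformancePredictor(latent_dim=16)
adj = torch.tensor([[0., 1., 1., 0.],
                    [0., 0., 0., 1.],
                    [0., 0., 0., 1.],
                    [0., 0., 0., 0.]])
ops = F.one_hot(torch.tensor([0, 1, 2, 1]), num_classes=3).float()
mu, logvar = enc(adj, ops)
z = reparameterize(mu, logvar)
predicted_accuracy = head(z)
```

In such a setup, the surrogate can be queried cheaply on latent codes, so black-box optimizers (e.g. BO) can search the continuous embedding space instead of the discrete architecture space.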
