Task-Aware Neural Architecture Search

arXiv.org Artificial Intelligence

Designing neural networks by hand requires substantial time and resources. Recent techniques in Neural Architecture Search (NAS) have proven competitive with, or better than, traditional handcrafted design, although they require domain knowledge and have generally used limited search spaces. In this paper, we propose a novel framework for neural architecture search that uses a dictionary of models trained on base tasks and the similarity between the target task and the atoms of the dictionary, thereby generating an adaptive search space built from the dictionary's base models. By introducing a gradient-based search algorithm, we can evaluate and discover the best architecture in the search space without fully training the networks. The experimental results show the efficacy of our proposed task-aware approach.
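As a concrete, deliberately simplified reading of this idea, the sketch below assumes each dictionary atom pairs a task embedding with an architecture and uses cosine similarity to rank atoms; the function and model names are hypothetical, not the paper's API, and the gradient-based search itself is omitted.

```python
# Illustrative only: a cosine-similarity filter over a dictionary of
# base-task models, sketching how a task-aware search space might be built.
import numpy as np

def task_embedding(features: np.ndarray) -> np.ndarray:
    """Summarize a task by the mean of its input features
    (a stand-in for a learned task representation)."""
    return features.mean(axis=0)

def similarity(target: np.ndarray, atom: np.ndarray) -> float:
    """Cosine similarity between the target task and a dictionary atom."""
    denom = np.linalg.norm(target) * np.linalg.norm(atom) + 1e-12
    return float(target @ atom) / float(denom)

def adaptive_search_space(target, dictionary, top_k=2):
    """Keep the base models whose tasks look most like the target;
    their architectures seed the search space."""
    ranked = sorted(dictionary,
                    key=lambda a: similarity(target, a["embedding"]),
                    reverse=True)
    return [a["architecture"] for a in ranked[:top_k]]

# Toy usage: three base tasks with random embeddings.
rng = np.random.default_rng(0)
dictionary = [{"embedding": rng.normal(size=16),
               "architecture": f"base_model_{i}"} for i in range(3)]
target = task_embedding(rng.normal(size=(100, 16)))
print(adaptive_search_space(target, dictionary))
```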


Top Works In Neural Architecture Search Domain

#artificialintelligence

Currently employed neural network architectures have mostly been developed manually by human experts, which is a time-consuming and error-prone process. This is where Neural Architecture Search, a subset of AutoML, comes to the rescue. Neural Architecture Search (NAS) is the process of automating architecture engineering. Here we list the top research works in Neural Architecture Search, ranked by their popularity on GitHub. These works have set new baselines, produced new network designs, and more.
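For readers new to the topic, here is a minimal sketch of what "automating architecture engineering" means in its simplest form: random search over a hand-written search space. The space and the scoring function are toy placeholders, not any of the listed works' methods.

```python
# Illustrative only: NAS in its simplest form, random search over a toy space.
import random

SEARCH_SPACE = {"depth": [2, 4, 8],
                "width": [32, 64, 128],
                "activation": ["relu", "tanh"]}

def sample_architecture():
    """Draw one candidate architecture from the search space."""
    return {name: random.choice(options)
            for name, options in SEARCH_SPACE.items()}

def score(arch):
    """Stand-in for 'train the candidate and measure validation accuracy'."""
    return arch["width"] / 128 - 0.05 * arch["depth"]

best = max((sample_architecture() for _ in range(20)), key=score)
print("best candidate:", best)
```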


Visual Steering for One-Shot Deep Neural Network Synthesis

arXiv.org Machine Learning

Our visual steering interface is designed to guide analysts in constructing the best-performing deep neural network architecture for a given application using a one-shot search algorithm. The first component is the Lego View, where the analyst can create and edit the different components of a large neural network with simple drag-and-drop operations. An initial large neural network is treated as a super graph (shown in the Graph View), and the problem of finding the best-performing neural network architecture is framed as searching for the corresponding subgraph of this super graph. The Graph View visualizes the super graph, where each node is a block (a sequence of neural network components). The one-shot search algorithm evaluates the subgraphs of this super graph iteratively, gauges their accuracy against a test dataset, and assigns a fitness score to each node in the graph (Block Information View). The subgraphs are then projected as points into the scatterplot in the Search Space View and colored by their evaluation accuracy. Analysts can filter and analyze a specific region of the subgraph search space with zoom and pan operations in the Search Space View. Finally, all blocks with high fitness scores are combined to create the best-performing candidate neural network architecture.

Abstract: Recent advancements in the area of deep learning have shown the effectiveness of very large neural networks in several applications. However, as these deep neural networks continue to grow in size, it becomes more and more difficult to configure their many parameters to obtain good results. Presently, analysts must experiment with many different configurations and parameter settings, which is labor-intensive and time-consuming. On the other hand, the capacity of fully automated techniques for neural network architecture search is limited without the domain knowledge of human experts.
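A minimal sketch of the one-shot loop just described, assuming a super graph of blocks with candidate operations, a placeholder accuracy function in place of real weight-shared evaluation, and per-block fitness accumulated from sampled subgraphs; all names are illustrative.

```python
# Illustrative only: sample subgraphs of a super graph, score them, credit
# each block's chosen operation, then keep the best operation per block.
import random
from collections import defaultdict

SUPER_GRAPH = {  # block name -> candidate operations
    "block1": ["conv3x3", "conv5x5", "skip"],
    "block2": ["conv3x3", "maxpool", "skip"],
    "block3": ["conv5x5", "maxpool", "skip"],
}

def accuracy(subgraph):
    """Stand-in for evaluating the weight-shared subgraph on a test set."""
    return sum(len(op) for op in subgraph.values()) / 30 + random.uniform(0, 0.05)

fitness = defaultdict(list)  # (block, op) -> scores of subgraphs that used it
for _ in range(200):
    subgraph = {b: random.choice(ops) for b, ops in SUPER_GRAPH.items()}
    acc = accuracy(subgraph)
    for block, op in subgraph.items():
        fitness[(block, op)].append(acc)

def mean_fitness(block, op):
    scores = fitness[(block, op)]
    return sum(scores) / len(scores) if scores else 0.0

# Combine the highest-fitness operation per block into one candidate network.
candidate = {b: max(ops, key=lambda op, b=b: mean_fitness(b, op))
             for b, ops in SUPER_GRAPH.items()}
print("combined candidate:", candidate)
```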


Automatically finding the best Neural Network for your GAN

#artificialintelligence

Generative Adversarial Networks (GANs) have been a hot topic in deep learning ever since their introduction at NIPS 2014, and for good reason: GANs can create new content from only a small amount of guidance, and it is that sort of creativity that makes them so powerful. Accordingly, massive resources are being poured into GAN research, both to understand how GANs work and to design the best possible GAN networks.
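For context, the mechanism behind that creativity is the two-player minimax game from the original NIPS 2014 GAN paper, in which a generator G learns to fool a discriminator D:

```latex
\min_G \max_D \;
  \mathbb{E}_{x \sim p_{\mathrm{data}}}\!\left[\log D(x)\right]
  + \mathbb{E}_{z \sim p_z}\!\left[\log\!\left(1 - D(G(z))\right)\right]
```

Work that applies NAS to GANs typically searches over the architecture of G (and sometimes D) while training under this objective.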


A Comprehensive Survey of Neural Architecture Search: Challenges and Solutions

arXiv.org Machine Learning

Deep learning has made major breakthroughs and progress in many fields, owing to its powerful capacity for automatic representation learning. It has been shown that the design of the network architecture is crucial to the feature representation of data and to final performance. To obtain good feature representations, researchers have designed various complex network architectures. However, architecture design relies heavily on the researchers' prior knowledge and experience. A natural idea, therefore, is to reduce human intervention as much as possible and let an algorithm automatically design the network architecture, a step toward stronger machine intelligence. In recent years, a large number of algorithms for Neural Architecture Search (NAS) have emerged. These works improve the NAS pipeline in various ways, and the related literature is rich and complex. To lower the barrier for beginners conducting NAS-related research, a comprehensive and systematic survey of NAS is essential. Previous surveys classified existing work mainly by the basic components of NAS: the search space, the search strategy, and the evaluation strategy. This classification is intuitive, but it makes it difficult for readers to grasp the challenges and the landmark work along the way. Therefore, in this survey, we provide a new perspective: we begin with an overview of the characteristics of the earliest NAS algorithms, summarize the problems in these early approaches, and then present the solutions proposed by subsequent work. In addition, we provide a detailed and comprehensive analysis, comparison, and summary of these works. Finally, we outline possible directions for future research.
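To make the three components the survey mentions concrete, here is a generic loop in which each component is a separate function; every name and score is a placeholder, not any surveyed method.

```python
# Illustrative only: search space, search strategy, and evaluation strategy
# as the three interchangeable pieces of a generic NAS loop.
import random

def search_space():
    """Enumerate candidate architectures (toy depth/width grid)."""
    return [{"depth": d, "width": w} for d in (2, 4, 8) for w in (32, 64)]

def search_strategy(candidates, n_trials=4):
    """Decide which candidates to try next (here: uniform sampling)."""
    return random.sample(candidates, k=min(n_trials, len(candidates)))

def evaluation_strategy(arch):
    """Estimate quality cheaply instead of fully training (toy proxy)."""
    return arch["width"] / (10 * arch["depth"])

chosen = max(search_strategy(search_space()), key=evaluation_strategy)
print("chosen architecture:", chosen)
```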