CNN Architectures, a Deep-dive

#artificialintelligence

VGG Net is a plain and straightforward CNN architecture compared with the others. Though it looks simple, it outperforms many more complex architectures. It was the first runner-up in the 2014 ImageNet Challenge. As shown above, there are six VGGNet architectures in total; among them, VGG-16 and VGG-19 are the most popular.
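
To make the layer pattern concrete, below is a minimal sketch of the VGG-16 stack, written here in PyTorch purely as an illustration. The standard 13-convolution plus 3-fully-connected layout is followed, but details such as dropout, batch normalization variants, and weight initialization are omitted, and the helper names are my own.

```python
import torch
import torch.nn as nn

def vgg_block(in_ch, out_ch, num_convs):
    """A VGG-style block: repeated 3x3 convs + ReLU, then 2x2 max-pooling."""
    layers = []
    for _ in range(num_convs):
        layers += [nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True)]
        in_ch = out_ch
    layers.append(nn.MaxPool2d(kernel_size=2, stride=2))
    return nn.Sequential(*layers)

class VGG16Sketch(nn.Module):
    """Illustrative VGG-16: 13 conv layers in 5 blocks, then 3 fully connected layers."""
    def __init__(self, num_classes=1000):
        super().__init__()
        self.features = nn.Sequential(
            vgg_block(3, 64, 2),     # 224x224 -> 112x112
            vgg_block(64, 128, 2),   # 112x112 -> 56x56
            vgg_block(128, 256, 3),  # 56x56  -> 28x28
            vgg_block(256, 512, 3),  # 28x28  -> 14x14
            vgg_block(512, 512, 3),  # 14x14  -> 7x7
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(512 * 7 * 7, 4096), nn.ReLU(inplace=True),
            nn.Linear(4096, 4096), nn.ReLU(inplace=True),
            nn.Linear(4096, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Quick shape check on a single 224x224 RGB image.
print(VGG16Sketch()(torch.randn(1, 3, 224, 224)).shape)  # torch.Size([1, 1000])
```

VGG-19 differs only in using four convolutions instead of three in the last three blocks, which is why the two networks are usually discussed together.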


Neural Architecture Search in Embedding Space

arXiv.org Machine Learning

The neural architecture search (NAS) algorithm with reinforcement learning can be a powerful and novel framework for automatically discovering neural architectures. However, its application is restricted by non-continuous and high-dimensional search spaces, which make optimization difficult. To resolve these problems, we propose NAS in embedding space (NASES), a novel framework. Unlike other NAS-with-reinforcement-learning approaches that search over a discrete and high-dimensional architecture space, this approach enables reinforcement learning to search in an embedding space by using architecture encoders and decoders. Our experiments demonstrate that the performance of the final architecture found by the NASES procedure is comparable with that of other popular NAS approaches on the CIFAR-10 image classification task. The performance and efficiency of NASES were notable even when only the architecture-embedding search and controller pre-training were applied, without other NAS tricks such as parameter sharing. Specifically, the procedure achieved a considerable reduction in search cost, requiring on average only about 100 evaluated architectures to reach a final architecture. Deep neural networks have enabled advances in image recognition, sequential pattern recognition, recommendation systems, and various other tasks over the past decades.
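
The abstract's core idea can be sketched as a loop: an encoder maps discrete architectures into a continuous embedding, a controller searches that embedding, and a decoder maps embeddings back to architectures for evaluation. The sketch below is a heavily simplified, self-contained illustration of that loop, not the paper's implementation; the decoder, the toy scoring function, and the perturbation-based "controller" are all placeholders of mine standing in for the learned encoder/decoder networks and the reinforcement-learning policy.

```python
import random

EMBED_DIM = 4  # dimensionality of the continuous embedding space (illustrative)

def decode_embedding(z):
    """Placeholder decoder: map a point in embedding space to a discrete architecture."""
    num_layers = int(abs(z[0]) * 10) % 8 + 1
    width = 16 * (int(abs(z[1]) * 10) % 4 + 1)
    return {"num_layers": num_layers, "width": width}

def evaluate_architecture(arch):
    """Stand-in for training the candidate and measuring validation accuracy."""
    return -abs(arch["num_layers"] - 4) - abs(arch["width"] - 48) / 16 + random.gauss(0, 0.1)

def propose(z, step=0.3):
    """Crude controller step: perturb the current embedding (stand-in for policy updates)."""
    return [zi + random.gauss(0, step) for zi in z]

# Search loop in embedding space (simplified NASES-style procedure).
random.seed(0)
best_z = [random.uniform(-1, 1) for _ in range(EMBED_DIM)]
best_arch = decode_embedding(best_z)
best_reward = evaluate_architecture(best_arch)

for _ in range(100):  # the abstract reports roughly 100 evaluated architectures on average
    z = propose(best_z)
    arch = decode_embedding(z)
    reward = evaluate_architecture(arch)
    if reward > best_reward:
        best_z, best_arch, best_reward = z, arch, reward

print("best architecture found:", best_arch)
```

The point of the continuous embedding is that small moves in the search space correspond to small changes in the decoded architecture, which is what makes gradient- or policy-based optimization tractable compared with searching the raw discrete space.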


How to decide what neural network architecture to use? • /r/MachineLearning

@machinelearnbot

The problem comes when he talks about the hidden layers. He basically says that the more hidden layers the better (at the price of being more 'expensive' to compute), and that the number of neurons in each layer should be comparable to, or greater than, the number of initial inputs. But this explanation seems kind of vague/random; there are infinitely many combinations you could choose from. Do you just try them one by one until one architecture seems to work? For example, what architecture would you use to make a program that distinguishes the numbers 1 to 10 in, say, a 50x50 pixel window? How would you come up with that?
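
For the 50x50 digit example in the question, a common starting point (not a prescribed answer) is a small fully connected network whose first hidden layer is on the same order of magnitude as the 2,500 input pixels, then shrunk or grown based on validation accuracy. A minimal PyTorch sketch, with layer sizes chosen purely for illustration:

```python
import torch
import torch.nn as nn

# Hypothetical baseline for classifying digits in a 50x50 grayscale window
# (2500 inputs, 10 classes). Layer widths are illustrative, not a recommendation.
model = nn.Sequential(
    nn.Flatten(),              # (batch, 1, 50, 50) -> (batch, 2500)
    nn.Linear(50 * 50, 2048),  # hidden width roughly comparable to the input size
    nn.ReLU(),
    nn.Linear(2048, 512),
    nn.ReLU(),
    nn.Linear(512, 10),        # one logit per digit class
)

# Shape check with a dummy batch of four images.
logits = model(torch.randn(4, 1, 50, 50))
print(logits.shape)  # torch.Size([4, 10])
```

In practice the honest answer to the question is largely empirical: start from a baseline like this (or a small CNN for image data), then adjust depth and width guided by validation performance rather than a fixed rule.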


What is Machine Learning Architecture? Do you have examples? • /r/MachineLearning

@machinelearnbot

What is Machine Learning Architecture? I'm looking for a new job in data science and have found positions titled "Machine Learning Architect". I'm not really sure what it is, but maybe I am still a newbie. What does a Machine Learning Architect do, and what are the required skills? Can you give examples, please?


Global Big Data Conference

#artificialintelligence

The semiconductor is the foundational technology of the digital age. It gave Silicon Valley its name. It sits at the heart of the computing revolution that has transformed every facet of society over the past half-century. The pace of improvement in computing capabilities has been breathtaking and relentless since Intel introduced the world's first microprocessor in 1971. In line with Moore's Law, computer chips today are many millions of times more powerful than they were fifty years ago.