Learning Where To Look -- Generative NAS is Surprisingly Efficient
Lukasik, Jovita; Jung, Steffen; Keuper, Margret
The efficient, automated search for well-performing neural architectures (NAS) has drawn increasing attention in the recent past. The predominant research objective is to reduce the need for costly evaluations of neural architectures while efficiently exploring large search spaces. To this aim, surrogate models embed architectures in a latent space and predict their performance, while generative models for neural architectures enable optimization-based search within the latent space the generator draws from. Both surrogate and generative models aim to facilitate query-efficient search in a well-structured latent space. In this paper, we further improve the trade-off between query efficiency and promising architecture generation by leveraging the advantages of both efficient surrogate models and generative design. To this end, we propose a generative model, paired with a surrogate predictor, that iteratively learns to generate samples from increasingly promising latent subspaces. This approach leads to very effective and efficient architecture search while keeping the number of queries low. In addition, our approach allows us to jointly optimize for multiple objectives, such as accuracy and hardware latency, in a straightforward manner. We show the benefit of this approach not only for optimizing architectures for highest classification accuracy but also under hardware constraints, and we outperform state-of-the-art methods on several NAS benchmarks for single and multiple objectives. We also achieve state-of-the-art performance on ImageNet. The code is available at http://github.com/jovitalukasik/AG-Net .
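The iterative loop the abstract describes — generate candidates from the latent space, rank them with a surrogate predictor, spend the few costly queries on the top-ranked candidates, and refocus generation on the most promising subspace — can be illustrated with a minimal, self-contained sketch. Everything here is an assumption for illustration: a toy 2-D latent space, a stand-in "true accuracy" function in place of real architecture training, and an inverse-distance surrogate in place of the paper's learned generator and predictor (AG-Net itself is more involved).

```python
import random

# Toy sketch of a query-efficient generate-predict-query loop (assumptions
# throughout; this is not the AG-Net implementation).
random.seed(0)
DIM = 2  # toy latent dimensionality

def true_accuracy(z):
    # Stand-in for an expensive architecture evaluation (assumption):
    # peaks at z = (0.7, 0.7).
    return 1.0 - sum((zi - 0.7) ** 2 for zi in z)

def surrogate_predict(z, evaluated):
    # Inverse-distance-weighted surrogate over already-queried points
    # (a cheap stand-in for a learned performance predictor).
    num = den = 0.0
    for p, acc in evaluated:
        d = sum((a - b) ** 2 for a, b in zip(p, z)) + 1e-9
        num += acc / d
        den += 1.0 / d
    return num / den

def search(iterations=5, samples=50, queries_per_iter=2):
    # Start from one randomly drawn, evaluated latent point.
    z0 = [random.random() for _ in range(DIM)]
    evaluated = [(z0, true_accuracy(z0))]
    for it in range(iterations):
        # "Increasingly promising latent subspace": sample around the
        # current best point with a shrinking radius.
        centre = max(evaluated, key=lambda p: p[1])[0]
        radius = 0.5 * (0.5 ** it)
        cands = [[c + random.uniform(-radius, radius) for c in centre]
                 for _ in range(samples)]
        # Rank many cheap candidates by the surrogate ...
        cands.sort(key=lambda z: surrogate_predict(z, evaluated), reverse=True)
        # ... but pay for only a handful of true evaluations (queries).
        for z in cands[:queries_per_iter]:
            evaluated.append((z, true_accuracy(z)))
    return evaluated

history = search()
best_z, best_acc = max(history, key=lambda p: p[1])
print(round(best_acc, 3))
```

With 50 cheap surrogate rankings per iteration but only 2 true evaluations, the loop keeps the query count low while steering generation toward the high-accuracy region — the same trade-off the paper targets with its learned generator.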
Stephen Lukasik, Who Pushed Tech in National Defense, Dies at 88
His incentive at the time, he wrote in a reminiscence, was to assist the National Security Agency, which employed "vast numbers of transcribers and translators to make sense of a multitude of communication channels they monitored." In one instance he had ARPA researchers work on using artificial intelligence to transcribe manual Morse code. "In my view, he was one of the few people who really thought about how science and technology serve national security," said Sharon Weinberger, author of "The Imagineers of War: The Untold Story of DARPA, the Pentagon Agency That Changed the World" (2017). "He saw the role of strategy, not just widgets or weapons to serve the Pentagon, but the bigger picture around it." Dr. Lukasik was an early champion of the Arpanet, which began as an experiment in computer networking.
Convolution Kernels for Discriminative Learning from Streaming Text
Lukasik, Michal (University of Sheffield) | Cohn, Trevor (University of Melbourne)
Time series modeling is an important problem with applications in many domains. Here we consider discriminative learning from time series, where we seek to predict an output response variable from time series input. We develop a method based on convolution kernels to model discriminative learning over streams of text. Our method outperforms competitive baselines on three synthetic and two real datasets, covering rumour frequency modeling and popularity prediction tasks.
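A convolution kernel between two text streams can be sketched as a sum, over all cross-stream pairs of time-stamped documents, of a time-decay term multiplied by a text-similarity term. The concrete kernels below (an RBF kernel on timestamps and a token-overlap kernel on text) are illustrative assumptions, not the exact kernels used in the paper:

```python
import math

# Hedged sketch of a convolution kernel over streams of text: a stream is
# a list of (timestamp, tokens) pairs, and the kernel sums a temporal
# similarity times a textual similarity over all cross-stream pairs.

def time_kernel(t1, t2, bandwidth=1.0):
    # RBF kernel on timestamps (illustrative choice).
    return math.exp(-((t1 - t2) ** 2) / (2 * bandwidth ** 2))

def text_kernel(tokens1, tokens2):
    # Token-overlap kernel, i.e. a linear kernel on binary
    # bag-of-words vectors (illustrative choice).
    return len(set(tokens1) & set(tokens2))

def stream_kernel(s1, s2, bandwidth=1.0):
    # Convolution over all pairs of items from the two streams.
    return sum(time_kernel(t1, t2, bandwidth) * text_kernel(x1, x2)
               for t1, x1 in s1 for t2, x2 in s2)

a = [(0.0, ["rumour", "spreads"]), (1.5, ["official", "denial"])]
b = [(0.2, ["rumour", "grows"]), (2.0, ["denial", "issued"])]
print(stream_kernel(a, b))
```

Because it is a sum of products of valid kernels, `stream_kernel` is itself symmetric and positive semi-definite, so it can be plugged directly into kernel methods such as an SVM for the discriminative tasks the abstract mentions.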