Deep Learning


Qualcomm selected by DARPA's HIVE Project to accelerate the future of deep learning

#artificialintelligence



Deep Learning Research Review Week 1: Generative Adversarial Nets

@machinelearnbot

This week, I'll be doing a new series called Deep Learning Research Review. The authors combat the difficulty of generating large, realistic images in a single step by using multiple CNN models to sequentially generate images at increasing scales. The approach the authors take is to train a GAN that is conditioned on text features created by a recurrent text encoder (I won't go too much into this, but here's the paper for those interested). In order to create these versatile models, the authors train with three types of data: {real image, right text}, {fake image, right text}, and {real image, wrong text}.
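The three pair types above give the discriminator two distinct ways to be wrong: a fake image, or a real image with mismatched text. A minimal sketch of that loss, with a hypothetical toy discriminator standing in for the real conv/text-encoder network (the dictionary-based `discriminator` and its fields are illustrative assumptions, not the paper's model):

```python
import math

def discriminator(image, text):
    # Toy stand-in for a learned discriminator: returns a "probability"
    # that (image, text) is a real, matching pair. A real model would
    # embed the image and text; this hypothetical one reads toy labels.
    return 0.9 if image["real"] and image["subject"] == text else 0.1

def matching_aware_d_loss(real_image, fake_image, right_text, wrong_text):
    # The discriminator trains on three kinds of pairs:
    #   {real image, right text} -> should score high
    #   {fake image, right text} -> should score low
    #   {real image, wrong text} -> should score low
    s_r = discriminator(real_image, right_text)
    s_f = discriminator(fake_image, right_text)
    s_w = discriminator(real_image, wrong_text)
    return -(math.log(s_r) + 0.5 * (math.log(1 - s_f) + math.log(1 - s_w)))
```

The {real image, wrong text} term is what forces the discriminator to check image-text alignment rather than only image realism.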


Using Apache Spark with Intel BigDL on Mesosphere DC/OS · Blog

#artificialintelligence

Machine learning applications developed using BigDL and Spark can also take advantage of best-in-class streaming engines, the Lightbend Reactive Platform, and messaging technologies like Kafka that form the complete Fast Data Platform (FDP) suite. In this blog post, Lightbend's Fast Data Platform team and Intel's BigDL team collaborate to describe their experience implementing and deploying deep learning models on BigDL using Spark on Mesosphere DC/OS. The complete distribution of DC/OS includes a distributed systems kernel, a cluster manager, a container platform, and an operating system. The platform layer offers the core datacenter operating system support along with container and cluster management services.


Deep Learning personalization of Internet is next big leap - AI Trends

#artificialintelligence

On a basic conceptual level, deep learning approaches share a very basic trait. Google Translate's science-fiction-like "Word Lens" function is powered by a deep learning algorithm, and DeepMind's recent Go victory can also be attributed to DL, although the triumphant algorithm AlphaGo isn't a pure neural net but a hybrid, melding deep reinforcement learning with one of the foundational techniques of classical AI: tree search. Deep learning is an apt approach for tackling computational problems that are too complicated for simple algorithms to solve, such as image classification or natural language processing. It is quite possible that a large portion of the industries that currently leverage machine learning hold further unexploited potential for deep learning, and DL-based approaches can trump current best practices in many of them.
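The hybrid structure the excerpt attributes to AlphaGo (learned evaluation inside classical tree search) can be sketched in miniature: a depth-limited negamax search that delegates leaf evaluation to a value function instead of playing the game out. Everything here is a toy assumption (no MCTS, no policy network, and `value_net` is a trivial stand-in for a learned model):

```python
def value_net(state):
    # Stand-in for a learned value network: in AlphaGo this is a deep
    # net scoring a board position; here it just returns the toy state.
    return state

def search(state, moves_fn, depth):
    # Depth-limited negamax tree search whose leaf evaluation is handed
    # to value_net rather than a full playout -- the classical-search /
    # learned-evaluation hybrid described above, heavily simplified.
    moves = moves_fn(state)
    if depth == 0 or not moves:
        return value_net(state), None
    best_score, best_move = float("-inf"), None
    for m in moves:
        child_score, _ = search(m, moves_fn, depth - 1)
        score = -child_score  # the opponent's best outcome is our worst
        if score > best_score:
            best_score, best_move = score, m
    return best_score, best_move

# Toy subtraction game: a state is a pile size; a move removes 1 or 2.
moves_fn = lambda s: [m for m in (s - 1, s - 2) if m >= 0]
```

Replacing playouts with a learned evaluation is exactly what makes the search tractable in games like Go, where exhaustive search is impossible.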




Uncle Sam Wants Your Deep Neural Networks

#artificialintelligence

Earlier this year, Kaggle ran a $1 million contest to build algorithms capable of identifying signs of lung cancer in CT scans, helping to fuel a larger effort to apply neural networks to health care.


Taxonomy of Methods for Deep Meta Learning

#artificialintelligence

Two recent papers submitted to ICLR 2017 explore the use of reinforcement learning to learn new kinds of deep learning architectures ("Designing Neural Network Architectures using Reinforcement Learning" and "Neural Architecture Search with Reinforcement Learning"). The second paper (Neural Architecture Search) uses reinforcement learning (RL) to train an architecture-generator LSTM that emits a string-like description of new DL architectures. The trained generator RNN is a two-layer LSTM; each architecture this RNN generates is then trained for 50 epochs. We had a glimpse of a DSL-driven architecture in my previous post, "A Language Driven Approach to Deep Learning Training," where a quite general prescription is presented.
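The Neural Architecture Search loop described above can be caricatured in a few lines: a controller samples an architecture choice, the choice is scored (in the paper, by training the child network for 50 epochs and measuring validation accuracy; here by a fake reward table), and the controller's sampling logits are nudged toward high-reward choices via REINFORCE. The layer widths, reward values, and learning rate below are all invented for illustration:

```python
import math, random

random.seed(0)

CHOICES = [16, 32, 64]    # hypothetical layer widths the controller picks from
logits = [0.0, 0.0, 0.0]  # the toy controller's policy parameters

def sample():
    # Softmax sampling from the controller's logits.
    exps = [math.exp(l) for l in logits]
    z = sum(exps)
    probs = [e / z for e in exps]
    r, acc = random.random(), 0.0
    for i, p in enumerate(probs):
        acc += p
        if r <= acc:
            return i, probs
    return len(probs) - 1, probs

def reward(width):
    # Stand-in for child-network validation accuracy after training;
    # we pretend wider is better for this toy search space.
    return {16: 0.5, 32: 0.7, 64: 0.9}[width]

lr = 0.5
for _ in range(200):
    i, probs = sample()
    R = reward(CHOICES[i])
    # REINFORCE: grad of log pi(i) w.r.t. logits is one_hot(i) - probs.
    for j in range(len(logits)):
        logits[j] += lr * R * ((1.0 if j == i else 0.0) - probs[j])

best = CHOICES[max(range(len(logits)), key=lambda j: logits[j])]
```

The real controller emits a long token sequence (filter sizes, strides, connections) rather than a single width, but the reward-weighted update is the same idea.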


Year in Review: Deep Learning Breakthroughs 2016

@machinelearnbot

When AlphaGo made this move (at 1:18:22 in the video above), it baffled the human experts; no human had played anything like it, and its genius was only revealed later. This time DeepMind has partnered with Blizzard to allow AI researchers to deploy bots in the StarCraft II game environment. Previous game-playing AI successes with IBM's Deep Blue in chess and DeepMind's AlphaGo have been impressive, but a game like StarCraft presents even greater challenges: imperfect and dynamic information, and the need to plan and adapt over a longer time horizon. Google's Multilingual Neural Machine Translation is now able to translate between language pairs that the system has never encountered before.


Cray Moves to Lasso 'Big Data Deluge' EE Times

#artificialintelligence

"All the software in the Big Data Analytics Software Suite is downloadable for the Cray Urika-XC," Tim Barr, Cray's director of analytics and artificial intelligence product strategy, told EE Times in an exclusive interview. The components of the Big Data Analytics Software Suite include Cray's own Graph Engine (which includes some of the company's fastest graph-theoretic algorithms available today), the world-famous Apache Spark analytics environment, the BigDL distributed deep learning framework for Spark, the distributed Dask parallel computing libraries for analytics, and widely used analytics languages including Python, Scala, Java, and R. All are open source, but for a fee Cray will provide full support for the software suite, including a software subscription that covers maintenance, updates, and technical support from Cray. Dask is a flexible parallel computing library for analytics with two components: dynamic task scheduling optimized for computation, and "Big Data" collections like parallel arrays, dataframes, and lists that run on top of the dynamic task schedulers, according to Barr. Thus, you can move from analytics and AI workloads to scientific modeling and simulations and back without having to move a massive data set from the simulation system to a separate analytics system.
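The "dynamic task scheduling" component Barr describes rests on a simple idea: a graph of tasks whose arguments may name other tasks. Dask represents such graphs as plain dicts mapping keys to values or to tuples of a callable plus its arguments. The toy executor below resolves one of these graphs sequentially; it is a sketch of the concept, not Dask's API, which schedules independent tasks in parallel across threads, processes, or a cluster:

```python
from operator import add, mul

def get(graph, key, cache=None):
    # Resolve one key of a dict-based task graph. A node is either a
    # literal value or a task: a tuple of (callable, *args), where an
    # argument that matches a graph key refers to that task's result.
    cache = {} if cache is None else cache
    if key in cache:
        return cache[key]
    node = graph[key]
    if isinstance(node, tuple) and callable(node[0]):
        fn, *args = node
        result = fn(*(get(graph, a, cache) if a in graph else a
                      for a in args))
    else:
        result = node
    cache[key] = result  # each task runs at most once
    return result

graph = {
    "x": 1,
    "y": (add, "x", 10),   # y = x + 10
    "z": (mul, "y", "y"),  # z = y * y
}
```

Because the graph is just data, a scheduler can inspect it, find tasks with no unmet dependencies, and run them concurrently, which is what lets the same workload run on a laptop or a Urika-XC-class cluster.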