Deep Learning


Field Report: GPU Technology Conference 2016 - insideBIGDATA

#artificialintelligence

In summary, I had a blast at my first GTC. The only downside was that I wasn't on-site long enough to absorb everything, not even a fraction of the many great talks on Deep Learning and AI. But no worries: I treated my attendance as a learning experience, and I fully intend to drill down on many areas of interest after the fact (starting with this field report). As I sat in the conference press room watching the frenetic activity of the attendees passing by, I anticipated hours of fun digesting all that I saw. Look for many future articles here on insideBIGDATA that cover GPU technology, NVIDIA, the vendors I met, and the leading-edge research taking place in this space. I'm excited, and I hope you are too!


Modern LSTM Architectures? • /r/MachineLearning

@machinelearnbot

Is there a paper that describes modern LSTM architectures, in the same way that the GoogLeNet paper is representative of modern CNNs?
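For context on the baseline the question takes for granted, here is a minimal NumPy sketch of one step of the standard LSTM cell, the starting point most "modern" variants build on. The stacked weight layout and the gate ordering are illustrative assumptions, not any single paper's convention:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One step of a standard LSTM cell.

    W: (4*hidden, input) input weights, U: (4*hidden, hidden) recurrent
    weights, b: (4*hidden,) biases, stacked as [input, forget, cell, output].
    """
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b      # all four gates in one matmul
    i = sigmoid(z[0*n:1*n])         # input gate
    f = sigmoid(z[1*n:2*n])         # forget gate
    g = np.tanh(z[2*n:3*n])         # candidate cell state
    o = sigmoid(z[3*n:4*n])         # output gate
    c = f * c_prev + i * g          # new cell state
    h = o * np.tanh(c)              # new hidden state
    return h, c

# Toy usage: one step with hidden size 3 and input size 2 (random weights).
rng = np.random.default_rng(0)
h, c = lstm_step(rng.normal(size=2), np.zeros(3), np.zeros(3),
                 rng.normal(size=(12, 2)), rng.normal(size=(12, 3)),
                 np.zeros(12))
```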


Implementing a Distributed Deep Learning Network over Spark

@machinelearnbot

Deep learning is becoming an important AI paradigm for pattern recognition, image/video processing, and fraud detection applications in finance. The computational complexity of a deep learning network dictates the need for a distributed realization. Our intention is to parallelize the training phase of the network and consequently reduce training time. We have built the first prototype of our distributed deep learning network over Spark, which has emerged as a de facto standard for realizing machine learning at scale. Geoffrey Hinton presented the paradigm for fast learning in a deep belief network [Hinton 2006].
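A minimal sketch of the data-parallel scheme described here: each Spark partition computes a gradient on its shard of the data and the driver averages the results. Logistic regression stands in for the deep network; the toy data, learning rate, and partition count are illustrative assumptions, not the authors' actual setup:

```python
import numpy as np
from pyspark import SparkContext

sc = SparkContext(appName="distributed-dl-sketch")

def local_gradient(w_bc):
    def compute(partition):
        rows = list(partition)
        if not rows:
            return iter([])
        X = np.array([r[0] for r in rows])
        y = np.array([r[1] for r in rows])
        preds = 1.0 / (1.0 + np.exp(-X @ w_bc.value))
        return iter([X.T @ (preds - y) / len(rows)])  # mean gradient on shard
    return compute

# Toy dataset: (features, label) pairs spread over 4 partitions.
rng = np.random.default_rng(0)
data = [(rng.normal(size=10), int(rng.integers(0, 2))) for _ in range(1000)]
rdd = sc.parallelize(data, numSlices=4).cache()

w = np.zeros(10)
for epoch in range(20):
    w_bc = sc.broadcast(w)                 # ship current weights to workers
    grads = rdd.mapPartitions(local_gradient(w_bc)).collect()
    w = w - 0.5 * sum(grads) / len(grads)  # average gradients, driver update
```

Averaging per-partition gradients on the driver is the simplest way to cut training time with data parallelism; more elaborate schemes overlap communication with computation, but the structure is the same.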



Nvidia Puts The Accelerator To The Metal With Pascal

#artificialintelligence

The revolution in GPU computing started with games, and spread to the HPC centers of the world eight years ago with the first "Fermi" Tesla accelerators from Nvidia. But hyperscalers and their deep learning algorithms are driving the architecture of the "Pascal" GPUs and the Tesla accelerators that Nvidia unveiled today at the GPU Technology Conference in its hometown of San Jose. Not only did the hyperscalers and their AI efforts help drive the Pascal architecture, but they will be the first companies to get their hands on all of the Tesla P100 accelerators based on the Pascal GP100 GPU that Nvidia can manufacture, long before they become generally available in early 2017 through server partners who make hybrid CPU-GPU systems. As was the case with prior generations of GPU compute engines, Nvidia will eventually offer multiple versions of the Pascal GPU for specific workloads and use cases. But Nvidia has made the big bet and created its high-end GP100 variant of Pascal while making other big bets at the same time, such as moving to a 16 nanometer FinFET process from chip fab partner Taiwan Semiconductor Manufacturing Corp and adding High Bandwidth Memory from memory partner Samsung. Jen-Hsun Huang, co-founder and CEO at Nvidia, said during his opening keynote that Nvidia has a rule about how many big bets it can make.


Artificial Intelligence Helps Diagnose Cancer

#artificialintelligence

Which are the cancerous cells in this image? With a new technique that combines a microscope and deep learning software, it might be easier than ever to tell the difference. Identifying cancer based on blood samples can be surprisingly challenging. Often, doctors add chemicals to a sample to make the cancerous cells visible, but that renders the sample unusable for other tests. Other techniques identify cancerous cells by their abnormal structure, but these take more time (such cells are often rare in a given sample) and can misidentify healthy but misshapen cells as cancerous.
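As a rough illustration of the "deep learning software" half of the technique, a binary image classifier over cell images might look like the sketch below. The input size, network shape, and the stand-in `images`/`labels` arrays are all assumptions; the article does not describe the actual model:

```python
import numpy as np
from tensorflow.keras import layers, models

# Label each cell image as cancerous (1) or healthy (0).
model = models.Sequential([
    layers.Input(shape=(64, 64, 1)),          # assumed 64x64 grayscale crops
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),    # probability the cell is cancerous
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# Hypothetical data: (n, 64, 64, 1) image array and (n,) 0/1 labels.
images = np.random.rand(100, 64, 64, 1).astype("float32")
labels = np.random.randint(0, 2, size=100)
model.fit(images, labels, epochs=5, batch_size=16)
```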


Now Anyone Can Use Google's Deep Learning Techniques

#artificialintelligence

Google has announced a new machine learning platform for developers at its NEXT Google Cloud Platform user conference. Eric Schmidt, Google's chairman, explained that Google believes machine learning is "what's next." There are two parts to Google's Cloud Machine Learning platform. The first allows developers to build machine learning models from their own data stored in tools such as Google Cloud Dataflow, Google BigQuery, Google Cloud Dataproc, Google Cloud Storage, and Google Cloud Datalab. The second offers pre-trained models, including existing APIs like the Google Translate API and Cloud Vision API as well as new services like the Google Cloud Speech API.
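To give a feel for the pre-trained-model side, here is a sketch of calling the Cloud Vision API over REST to label an image. The API key and file path are placeholders; the request shape follows the v1 images:annotate endpoint:

```python
import base64
import requests

API_KEY = "YOUR_API_KEY"  # placeholder credential
url = "https://vision.googleapis.com/v1/images:annotate?key=" + API_KEY

# Base64-encode the image, as the JSON API requires.
with open("photo.jpg", "rb") as f:
    content = base64.b64encode(f.read()).decode("utf-8")

body = {
    "requests": [{
        "image": {"content": content},
        "features": [{"type": "LABEL_DETECTION", "maxResults": 5}],
    }]
}
resp = requests.post(url, json=body)
for ann in resp.json()["responses"][0].get("labelAnnotations", []):
    print(ann["description"], ann["score"])
```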


Bottoming Out

#artificialintelligence

In order to get a grasp on what makes optimization difficult in machine learning, it is important to specialize our focus. Nonsmooth optimization is so general that what makes deep learning hard may be completely different from what makes tensor decomposition difficult. So in this post, I want to focus on deep learning and take a bit of a controversial stand. It has been my experience that optimization is not at all what makes deep learning challenging. I show the training error on everyone's favorite machine learning benchmark, MNIST.
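The observation is easy to reproduce: a small fully connected net drives MNIST *training* error essentially to zero, which is the post's point that optimization is not the bottleneck. A quick sketch, with layer sizes and epoch count chosen as illustrative assumptions:

```python
from tensorflow.keras import layers, models, datasets

(x_train, y_train), _ = datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0

model = models.Sequential([
    layers.Input(shape=(784,)),
    layers.Dense(256, activation="relu"),
    layers.Dense(256, activation="relu"),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Track training accuracy per epoch; it climbs toward 100% quickly.
history = model.fit(x_train, y_train, epochs=20, batch_size=128)
print("final training error:", 1.0 - history.history["accuracy"][-1])
```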


Machine Learning Could Be Weaponized In Fight Against ISIS

#artificialintelligence

Deep learning machines could help decode ISIS as a network and develop a strategy for its defeat. The use of deep learning machines could help the Pentagon decode the structure of ISIS as a network and allow for a more precisely developed strategy for its defeat, according to Pentagon Deputy Secretary Robert Work. He was making the case for using artificial intelligence (A.I.) for open-source data crunching, Inverse.com reported. "We are absolutely certain that the use of deep-learning machines is going to allow us to have a better understanding of ISIL as a network and better understanding about how to target it precisely and lead to its defeat," said Secretary Work, according to DoD's website. Speaking at an event organized by the Washington Post, Work said he had his epiphany while watching a Silicon Valley tech company demonstrate "a machine that took in data from Twitter, Instagram, and many other public sources to show the July 2014 Malaysia Airlines Flight 17 shoot-down in real time."


DeepMind Could Bring The Best News Recommendation Engine -- Monday Note

#artificialintelligence

Reinforcement Learning, a key Google DeepMind technique, could overhaul news recommendation engines and greatly improve user stickiness. After beating a Go grandmaster, it could become the engine of choice for true personalization. My interest in DeepMind goes back to its acquisition by Google, in January 2014, for about half a billion dollars. Later, in California, I had conversations with Artificial Intelligence and deep learning experts; they said Google had in fact captured about half of the world's best A.I. minds, snatching several years of Stanford A.I. classes and paying top dollar for talent. Acquiring the London startup DeepMind was a key move in a strategy aimed at cornering the A.I. field.
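The simplest reinforcement-learning flavor of a news recommender is a multi-armed bandit: articles are arms, a click is a reward, and the engine balances exploring new articles against exploiting known winners. A toy epsilon-greedy sketch, where the per-article click rates and epsilon are made-up illustration values:

```python
import numpy as np

rng = np.random.default_rng(42)
true_ctr = np.array([0.02, 0.05, 0.11, 0.04])  # hidden per-article click rates
n_articles = len(true_ctr)
clicks = np.zeros(n_articles)
shows = np.zeros(n_articles)
epsilon = 0.1                                   # fraction of traffic spent exploring

for step in range(10000):
    if rng.random() < epsilon:                  # explore: show a random article
        a = int(rng.integers(n_articles))
    else:                                       # exploit: best observed click rate
        est = np.where(shows > 0, clicks / np.maximum(shows, 1), 1.0)
        a = int(np.argmax(est))
    shows[a] += 1
    clicks[a] += rng.random() < true_ctr[a]     # simulated user click

print("estimated CTRs:", clicks / np.maximum(shows, 1))
```

Real systems condition the choice on the user and article features (contextual bandits or full RL), but the explore/exploit trade-off above is the core of why RL fits recommendation so naturally.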