deep learning


Training and Visualising Word Vectors

@machinelearnbot

In this tutorial I want to show how you can implement a skip-gram model in TensorFlow to generate word vectors for any text you are working with, and then use TensorBoard to visualize them. I found this exercise super useful to i) understand how the skip-gram model works and ii) get a feel for the kind of relationships these vectors capture about your text before you use them downstream in CNNs or RNNs. I trained a skip-gram model on the text8 dataset, which is a collection of English Wikipedia articles, and used TensorBoard to visualize the embeddings. TensorBoard lets you see the whole word cloud by using PCA to select three principal axes onto which the data is projected.
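As a rough sketch of the idea (not the tutorial's exact code), assuming a TensorFlow 1.x-style graph API and that the corpus has already been converted to integer word ids with (center, context) training pairs, the core of a skip-gram model looks like this:

```python
# Minimal skip-gram sketch (TensorFlow 1.x-style API); vocab_size, batch
# feeding and pair generation are assumed to be handled elsewhere.
import tensorflow as tf

vocab_size = 50000      # assumed vocabulary size for text8
embedding_dim = 128     # dimensionality of the word vectors
num_sampled = 64        # negative samples drawn for the NCE loss

center_words = tf.placeholder(tf.int32, shape=[None])       # batch of center word ids
context_words = tf.placeholder(tf.int32, shape=[None, 1])   # matching context word ids

# Embedding matrix: one row per vocabulary word; these rows are the word vectors.
embeddings = tf.Variable(tf.random_uniform([vocab_size, embedding_dim], -1.0, 1.0))
embed = tf.nn.embedding_lookup(embeddings, center_words)

# NCE (noise-contrastive estimation) loss approximates the full softmax over
# the vocabulary, which is what makes skip-gram training tractable.
nce_weights = tf.Variable(tf.truncated_normal([vocab_size, embedding_dim], stddev=0.1))
nce_biases = tf.Variable(tf.zeros([vocab_size]))
loss = tf.reduce_mean(
    tf.nn.nce_loss(weights=nce_weights, biases=nce_biases,
                   labels=context_words, inputs=embed,
                   num_sampled=num_sampled, num_classes=vocab_size))
train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)
```

After training, the learned embeddings variable can be saved in a checkpoint and pointed to from TensorBoard's embedding projector, which performs the PCA projection described above.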


Getting to the Heart of Arrhythmia with GPU-Powered AI NVIDIA Blog

#artificialintelligence

Artificial intelligence is quickly evolving into a lifesaver. Two separate efforts in the commercial and academic arenas have inched us closer to taking a bite out of heart disease -- the world's no. 1 killer. A Stanford University team led by Andrew Ng and a Silicon Valley startup are tapping the power of AI to improve detection of abnormalities and increase the accuracy of diagnoses. Medical-device maker AliveCor, based in Mountain View, is building deep learning AI algorithms to enable people to monitor their heart rates using built-in sensors on the Apple Watch. They can even alert people to take an immediate EKG using an Apple Watch app and a specially designed band with a built-in sensor.


Using Genetic Algorithm for Optimizing Recurrent Neural Networks

@machinelearnbot

Several tools are available (e.g. AutoML and TPOT) that can aid the user in the process of performing hundreds of experiments efficiently. Although this approach has produced state-of-the-art models in several domains, it is very time-consuming. Lately, due to the increase in available computing power, researchers have been employing Reinforcement Learning and Evolutionary Algorithms to automatically search for optimal neural architectures. In this tutorial, we will see how to apply a Genetic Algorithm (GA) to find the optimal window size and number of units in a Long Short-Term Memory (LSTM) based Recurrent Neural Network (RNN).
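As a rough illustration of the idea (not the tutorial's own code), here is a toy GA loop over (window_size, lstm_units) pairs; the fitness function is a hypothetical placeholder that would normally train an LSTM with those hyperparameters and score it on validation data:

```python
# Toy Genetic Algorithm sketch over (window_size, lstm_units).
import random

def evaluate_fitness(window_size, lstm_units):
    # Stand-in objective for illustration only. In practice: build input
    # windows of length `window_size`, train an LSTM with `lstm_units` units,
    # and return e.g. the negative validation RMSE so that higher is better.
    return -(abs(window_size - 7) + abs(lstm_units - 64))

def random_individual():
    return (random.randint(1, 30), random.randint(1, 128))

def mutate(individual, rate=0.3):
    window_size, lstm_units = individual
    if random.random() < rate:
        window_size = min(30, max(1, window_size + random.randint(-3, 3)))
    if random.random() < rate:
        lstm_units = min(128, max(1, lstm_units + random.randint(-16, 16)))
    return (window_size, lstm_units)

def crossover(parent_a, parent_b):
    # Swap one gene between the two parents.
    return (parent_a[0], parent_b[1]), (parent_b[0], parent_a[1])

population = [random_individual() for _ in range(10)]
for generation in range(20):
    ranked = sorted(population, key=lambda ind: evaluate_fitness(*ind), reverse=True)
    parents = ranked[:5]                       # keep the fitter half as parents
    children = []
    while len(children) < 5:
        a, b = random.sample(parents, 2)
        for child in crossover(a, b):
            children.append(mutate(child))
    population = parents + children[:5]

best = max(population, key=lambda ind: evaluate_fitness(*ind))
print("best (window_size, lstm_units):", best)
```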


IBM, Intel Rethink Processor Designs to Accommodate AI Workloads - The New Stack

#artificialintelligence

Artificial intelligence is bringing new demands to processors. The algorithmic data crunching is different from earlier models of processing data highlighted by benchmarks like LINPACK. It is also changing computing architectures by de-emphasizing the CPU and harnessing the faster computing power of coprocessors. The CPU is just a facilitator, and a lot of deep learning is done on accelerator chips like GPUs, FPGAs and Google's Tensor Processing Unit. Major hardware companies like IBM, Intel, Nvidia and AMD are embracing the change in architecture and tuning hardware to encourage the creation of artificial neural nets, as envisioned by researchers in the 1960s.


Transfer Learning using differential learning rates

@machinelearnbot

In this post, I will be sharing how one can use popular deep learning models for their own specific task using transfer learning. We will cover some concepts like differential learning rates, which are not yet implemented in some of the deep learning libraries. I learned about these from the fast.ai course, whose content will be available to the general public in early 2018 as a MOOC. Transfer learning is the process of using the knowledge learned in one process/activity and applying it to a different task.
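As a hedged sketch of the idea, differential learning rates can be expressed in plain PyTorch with optimizer parameter groups; the layer groupings and rate values below are illustrative assumptions, not the fast.ai library's implementation:

```python
# Differential learning rates: smaller updates for early (generic) layers of a
# pretrained network, larger updates for the newly added task-specific head.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet34(pretrained=True)
model.fc = nn.Linear(model.fc.in_features, 10)   # new head for our own task

optimizer = torch.optim.SGD([
    {"params": model.layer1.parameters(), "lr": 1e-4},  # earliest layers: tiny updates
    {"params": model.layer2.parameters(), "lr": 1e-4},
    {"params": model.layer3.parameters(), "lr": 1e-3},  # middle layers: moderate updates
    {"params": model.layer4.parameters(), "lr": 1e-3},
    {"params": model.fc.parameters(),     "lr": 1e-2},  # new head: largest updates
], momentum=0.9)
```

The intuition is that early layers of a pretrained model already encode generic features and only need gentle fine-tuning, while the freshly initialized head has to be trained from scratch.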


Data Science Bowl 2018: A Deep Learning Drive

@machinelearnbot

For the next 90 days, data scientists will have the chance to submit algorithms that can identify nuclei in cell samples without human intervention.


Faster R-CNN: Down the rabbit hole of modern object detection - Tryolabs Blog

@machinelearnbot

Previously, we talked about object detection: what it is and how it has recently been tackled using deep learning. If you haven't read our previous blog post, we suggest you take a look at it before continuing. Last year, we decided to get into Faster R-CNN, reading the original paper and all the referenced papers (and so on, and so on) until we got a clear understanding of how it works and how to implement it. We ended up implementing Faster R-CNN in Luminoth, a computer vision toolkit based on TensorFlow which makes it easy to train, monitor and use these types of models. So far, Luminoth has attracted an incredible amount of interest, and we even talked about it at both ODSC Europe and ODSC West.


6 ways hackers will use machine learning to launch attacks

#artificialintelligence

Defined as the "ability for (computers) to learn without being explicitly programmed," machine learning is huge news for the information security industry. It's a technology that potentially can help security analysts with everything from malware and log analysis to possibly identifying and closing vulnerabilities earlier. Perhaps too, it could improve endpoint security, automate repetitive tasks, and even reduce the likelihood of attacks resulting in data exfiltration. Naturally, this has led to the belief that these intelligent security solutions will spot - and stop - the next WannaCry attack much faster than traditional, legacy tools. "It's still a nascent field, but it is clearly the way to go in the future."


Artificial Neural Networks & It's Applications - XenonStack Blog

#artificialintelligence

Artificial Neural Networks are computational models inspired by the human brain. Many recent advancements in the field of Artificial Intelligence, including Voice Recognition, Image Recognition and Robotics, have been made using Artificial Neural Networks. These biologically inspired methods of computing are considered to be the next major advancement in the Computing Industry. The term 'Neural' is derived from the basic functional unit of the human (animal) nervous system, the 'neuron' or nerve cell, which is present in the brain and other parts of the human (animal) body. A neuron receives signals from other neurons.
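As a minimal illustration of that description, a single artificial neuron can be written as a weighted sum of its incoming signals passed through an activation function; the values below are illustrative, not taken from the article:

```python
# A single artificial neuron: weight the incoming signals, sum them with a
# bias, and pass the result through a (sigmoid) activation function.
import numpy as np

def neuron(inputs, weights, bias):
    z = np.dot(weights, inputs) + bias       # weighted sum of incoming signals
    return 1.0 / (1.0 + np.exp(-z))          # sigmoid activation

print(neuron(np.array([0.5, -1.0, 2.0]), np.array([0.8, 0.2, -0.4]), 0.1))
```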


AI Definitions: Machine Learning vs. Deep Learning vs. Cognitive Computing vs. Robotics vs. Strong AI….

#artificialintelligence

AI is the compelling topic of tech conversations du jour, yet within these conversations confusion often reigns – confusion caused by loose use of AI terminology. The problem is that AI comes in a variety of forms, each with its own distinct range of capabilities and techniques, and each at its own stage of development. Some forms of AI that we frequently hear about, such as Artificial General Intelligence (the kind of AI that might someday automate all work and that we might lose control of), may never come to pass. Others are already doing useful work and are driving growth in the high-performance sector of the technology industry. These definitions aren't meant to be the final word on AI terminology; the industry is growing and changing so fast that terms will change and new ones will be added.