Neural Networks


Cheat Sheets for AI, Neural Networks, Machine Learning, Deep Learning & Big Data

#artificialintelligence

Over the past few months, I have been collecting AI cheat sheets. From time to time I share them with friends and colleagues, and recently I have been asked for them so often that I decided to organize and share the entire collection. To make things more interesting and give context, I added descriptions and/or excerpts for each major topic.


Deep learning with word embeddings improves biomedical named entity recognition | Bioinformatics | Oxford Academic

#artificialintelligence

The most fundamental task in biomedical text mining is the recognition of named entities (NER), such as proteins, species, diseases, chemicals or mutations. Results: We show that a completely generic method based on deep learning and statistical word embeddings [called long short-term memory network-conditional random field (LSTM-CRF)] outperforms state-of-the-art entity-specific NER tools, often by a large margin. Here, we show that this entirely generic NER method, based on deep learning and distributional word semantics, outperforms such specific high-quality NER methods across different entity types and across different evaluation corpora. We assessed the performance of LSTM-CRF in 33 evaluations on 24 different gold-standard corpora (some with annotations for more than one entity type) covering five entity types: chemical names, disease names, species names, gene/protein names, and names of cell lines.
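
The architecture behind this result is easy to sketch. Below is a minimal, hedged illustration in PyTorch (the framework is my assumption; the paper does not prescribe one): pretrained word embeddings feed a bidirectional LSTM, which produces per-token tag scores. The full LSTM-CRF adds a conditional random field layer on top of these emission scores for structured decoding; the vocabulary size, dimensions, and tag set here are all hypothetical.

```python
# Minimal sketch of the embedding + BiLSTM portion of an LSTM-CRF tagger.
# The paper's full model adds a CRF layer over the emission scores
# (e.g. via a third-party CRF package) for structured sequence decoding.
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    def __init__(self, vocab_size=5000, emb_dim=100, hidden=128, num_tags=5):
        super().__init__()
        # In practice, this embedding would be initialized from pretrained
        # statistical word embeddings, as the excerpt describes.
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, bidirectional=True, batch_first=True)
        # Per-token scores over tags such as B-Chemical, I-Disease, O (hypothetical tag set).
        self.emissions = nn.Linear(2 * hidden, num_tags)

    def forward(self, tokens):                 # tokens: (batch, seq_len) word ids
        h, _ = self.lstm(self.emb(tokens))     # (batch, seq_len, 2 * hidden)
        return self.emissions(h)               # (batch, seq_len, num_tags)

model = BiLSTMTagger()
scores = model(torch.randint(0, 5000, (2, 12)))
print(scores.shape)  # torch.Size([2, 12, 5])
```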


Deep Learning: New steps for Natural Language Processing

#artificialintelligence

Natural Language Processing (NLP) of texts has been applied with varying degrees of success. New and interesting NLP applications have appeared, such as sentiment analysis (extracting the opinion a user expresses about a product), detection of user wants and needs, and user profiling. High-level abstraction of texts: Deep Learning technologies wisely combine the aforementioned word representations to obtain a semantic view of more complex texts such as sentences and documents. With this information, computers can grasp the real meaning of texts, obtaining better results than traditional approaches when complex analyses are involved (sentiment analysis, automatic translation, detection of entities in texts, question-answering systems, etc.).
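
As a rough illustration of how word representations can be combined into a text-level semantic view, here is a toy sketch in Python; the three-dimensional vectors are invented for the example, and real systems would use pretrained embeddings (e.g. word2vec or GloVe) and learned compositions rather than a simple average.

```python
# Toy demonstration: compose word vectors into sentence vectors and compare them.
import numpy as np

embeddings = {                 # in practice: pretrained word2vec / GloVe vectors
    "great":   np.array([0.9, 0.1, 0.0]),
    "awful":   np.array([-0.8, 0.2, 0.1]),
    "product": np.array([0.1, 0.9, 0.3]),
}

def sentence_vector(words):
    # Simplest possible composition: average the word vectors.
    # Deep models learn far richer compositions, but the principle is the same.
    return np.mean([embeddings[w] for w in words if w in embeddings], axis=0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

v1 = sentence_vector(["great", "product"])
v2 = sentence_vector(["awful", "product"])
# The two sentences end up far apart despite sharing "product",
# reflecting their opposite sentiment.
print(cosine(v1, v2))
```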


Deep Instinct Eyes Deep Learning Cybersecurity | PYMNTS.com

#artificialintelligence

Artificial intelligence technology is deployed by cybersecurity firms in an effort to keep pace with the evolution of cyberattacks, since machine learning algorithms improve their predictions the more they are used. But according to Guy Caspi, CEO of cybersecurity company Deep Instinct, machine learning is no longer enough in an age of unprecedented evolution and volume of cybercrime. Part of the reason is that machine learning typically relies on only two or three algorithms, while deep learning deploys tens of algorithms and more complex math. The ongoing evolution of corporate cybercrime means cybersecurity companies may no longer be able to afford to rely solely on machine learning.


Death to strap-ons, says Intel, yet thrusts its little AI stick into us all

#artificialintelligence

Now it's shifted its attention to the next thing it will presumably quickly lose interest in: a USB stick for running machine-learning workloads. Meanwhile, Intel wants everyone to pay attention to a USB stick it claims is the next big thing in AI. The "Movidius Neural Compute Stick" claims to offer a complete AI solution in the form of a $79 thumb drive. "The Myriad 2 VPU housed inside the Movidius Neural Compute Stick provides powerful, yet efficient performance – more than 100 gigaflops of performance within a 1W power envelope – to run real-time deep neural networks directly from the device," said Movidius VP Remi El-Ouazzane.


IBM's AI can predict schizophrenia by looking at the brain's blood flow

Engadget

However, pioneering research conducted by IBM and the University of Alberta could soon help doctors diagnose the onset of the disease, and the severity of its symptoms, using a simple MRI scan and a neural network built to look at blood flow within the brain. The research team first trained its neural network on a 95-member dataset of anonymized fMRI images from the Function Biomedical Informatics Research Network, which included scans of both patients with schizophrenia and a healthy control group. From this data, the neural network cobbled together a predictive model of the likelihood that a patient suffered from schizophrenia based on blood flow. What's more, the model also managed to predict the severity of symptoms once they set in.
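
For flavor only, here is a generic sketch of the kind of predictive model the excerpt describes, using scikit-learn on synthetic stand-in data; this is not IBM's actual pipeline, and the feature count, labels, and classifier choice are all invented for illustration.

```python
# Hedged sketch: a classifier over blood-flow-derived feature vectors.
# Everything here is synthetic; a real pipeline would extract features
# from the fMRI scans themselves.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_subjects, n_features = 95, 200          # e.g. regional blood-flow features per scan
X = rng.normal(size=(n_subjects, n_features))
y = rng.integers(0, 2, size=n_subjects)   # 1 = schizophrenia, 0 = control (synthetic labels)

clf = LogisticRegression(max_iter=1000)
# On random data this hovers around chance; on real features, cross-validated
# accuracy is exactly the kind of figure such studies report.
print(cross_val_score(clf, X, y, cv=5).mean())
```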


Intel puts Movidius AI tech on a $79 USB stick

Engadget

Last year, Movidius announced its Fathom Neural Compute Stick -- a USB thumb drive that makes its image-based deep learning capabilities super accessible. Now Intel has announced that the deep neural network processing stick is available under its new name, the Movidius Neural Compute Stick. "Designed for product developers, researchers and makers, the Movidius Neural Compute Stick aims to reduce barriers to developing, tuning and deploying AI applications by delivering dedicated high-performance deep-neural network processing in a small form factor," said Intel in a statement. The Compute Stick contains a Myriad 2 Vision Processing Unit that uses only around one watt of power.


Are Most Machine Learning Experts Turning to Deep Learning?

#artificialintelligence

It is very desirable to be able to inject domain knowledge into machine learning models, which is something that deep learning methods aren't able to do. We already know quite a bit about English grammar and sentence construction; why is it, then, that our latest and greatest deep-learning-based language models can't be guaranteed to obey those rules? I don't see deep learning completely overshadowing machine learning five years down the road. Bio: Zeeshan Zia researches computer vision and machine learning solutions at Microsoft.


Learning to Learn

#artificialintelligence

This differs from many standard machine learning techniques, which involve training on a single task and testing on held-out examples from that task. In particular, when approaching any new vision task, the well-known paradigm is to first collect labeled data for the task, acquire a network pre-trained on ImageNet classification, and then fine-tune the network on the collected data using gradient descent. Like the previous approach, meta-learning is performed using gradient descent (or your favorite neural network optimizer), whereas the learner corresponds to a comparison scheme, e.g. nearest-neighbor comparison in a learned metric space. Despite the simplicity of the approach, we were surprised to find that the method was able to substantially outperform a number of existing approaches on popular few-shot image classification benchmarks, Omniglot and MiniImageNet, including approaches that were much more complex or domain-specific.
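
The gradient-based meta-learning described here can be sketched in a few lines. The toy below, in PyTorch, follows the MAML-style recipe of differentiating through an inner adaptation step, on a sine-regression task distribution; the task family, network size, and learning rates are illustrative assumptions, not the post's exact setup.

```python
# Gradient-based meta-learning sketch: learn an initialization that adapts
# to a new task in a single gradient step (MAML-style).
import torch

def sample_task():
    # Each task: regress y = a * sin(x + b) with random amplitude/phase.
    a = torch.rand(1) * 4.9 + 0.1
    b = torch.rand(1) * 3.1416
    def sample(n=10):
        x = torch.rand(n, 1) * 10 - 5
        return x, a * torch.sin(x + b)
    return sample

# Explicit parameters for a tiny MLP (1 -> 40 -> 1), so the inner-loop
# update can be written functionally.
w1 = (torch.randn(40, 1) * 0.1).requires_grad_()
b1 = torch.zeros(40, requires_grad=True)
w2 = (torch.randn(1, 40) * 0.1).requires_grad_()
b2 = torch.zeros(1, requires_grad=True)
params = [w1, b1, w2, b2]

def forward(p, x):
    w1, b1, w2, b2 = p
    return torch.relu(x @ w1.T + b1) @ w2.T + b2

def mse(p, x, y):
    return ((forward(p, x) - y) ** 2).mean()

meta_opt = torch.optim.Adam(params, lr=1e-3)
inner_lr = 0.01

for step in range(1000):
    meta_loss = 0.0
    for _ in range(4):                       # meta-batch of tasks
        task = sample_task()
        x_tr, y_tr = task()                  # the task's small "training" set
        x_te, y_te = task()                  # held-out examples from the same task
        # Inner loop: one gradient step on this task...
        grads = torch.autograd.grad(mse(params, x_tr, y_tr), params, create_graph=True)
        adapted = [p - inner_lr * g for p, g in zip(params, grads)]
        # ...scored on the held-out examples.
        meta_loss = meta_loss + mse(adapted, x_te, y_te)
    meta_opt.zero_grad()
    meta_loss.backward()                     # outer loop: differentiate through the inner step
    meta_opt.step()
```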


How Is Apple Using Machine Learning? @ThingsExpo #AI #ML #DL #DX #IoT

#artificialintelligence

They use deep learning to extend battery life between charges on their devices, detect fraud on the Apple Store, recognize the locations and faces in your photos, and help Apple choose news stories for you. The system began leveraging machine learning techniques including deep neural networks (DNNs), long short-term memory units, convolutional neural networks, n-grams, and gated recurrent units. Making mobile AI faster with a new machine learning API: Apple wants to make the AI on your iPhone as powerful and fast as possible.
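
As a small, self-contained illustration of one of the techniques listed above, here is a toy gated-recurrent-unit (GRU) sequence classifier in PyTorch; it is unrelated to Apple's actual system, and every dimension here is made up.

```python
# Toy GRU classifier: embed a token sequence, run a GRU over it, and
# classify from the final hidden state.
import torch
import torch.nn as nn

class TinyGRUClassifier(nn.Module):
    def __init__(self, vocab=128, emb=32, hidden=64, classes=2):
        super().__init__()
        self.emb = nn.Embedding(vocab, emb)
        self.gru = nn.GRU(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, classes)

    def forward(self, tokens):
        _, h = self.gru(self.emb(tokens))   # h: (num_layers, batch, hidden)
        return self.out(h[-1])              # class scores from last hidden state

model = TinyGRUClassifier()
tokens = torch.randint(0, 128, (4, 20))     # batch of 4 sequences, length 20
print(model(tokens).shape)                  # torch.Size([4, 2])
```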