machine learning


Stanford Algorithm Can Diagnose Pneumonia Better Than Radiologists

IEEE Spectrum Robotics Channel

Stanford researchers have developed a machine-learning algorithm that can diagnose pneumonia from a chest X-ray better than a human radiologist can. And it learned how to do so in just about a month. The Machine Learning Group, led by Stanford adjunct professor Andrew Ng, was inspired by a data set released by the National Institutes of Health on 26 September. The data set contains 112,120 chest X-ray images labeled with 14 different possible diagnoses, along with some preliminary algorithms. The researchers asked four Stanford radiologists to annotate 420 of the images for possible indications of pneumonia.
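
The article doesn't include code, but the general recipe it describes is transfer learning on a pretrained convolutional network. Below is a minimal sketch, not the Stanford team's actual code: the published CheXNet model was a 121-layer DenseNet, but the training setup, input size, and data pipeline here are assumptions.

```python
# Not the Stanford team's code: a minimal sketch of fine-tuning a
# DenseNet-121 (the architecture the published CheXNet model used)
# as a binary pneumonia classifier. Training details are assumptions.
import tensorflow as tf
from tensorflow.keras.applications import DenseNet121
from tensorflow.keras import layers, models

base = DenseNet121(weights="imagenet", include_top=False,
                   input_shape=(224, 224, 3), pooling="avg")

model = models.Sequential([
    base,
    layers.Dense(1, activation="sigmoid"),  # P(pneumonia)
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC()])

# train_ds / val_ds would be tf.data pipelines over labeled chest X-rays
# (hypothetical loaders, not shown):
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```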


Deep Neural Networks for Face Detection Explained on Apple's Machine Learning Journal

@machinelearnbot

But bahgawd, the technology Apple is pulling off really does fall squarely into the realm of the magical. Anyone else just wowed by the amount of technology embedded in this new iPhone? Our phones are learning more about us than we ever knew before.


Achieving Accurate, Reliable AI Trajectory Magazine

#artificialintelligence

What will happen to a person's artificial intelligence (AI) when they retire? When a prospective employee interviews for a job, will his or her AI be questioned alongside them? Will companies hire AI straight from a factory, or will the system undergo a sort of apprenticeship before being put to work? More importantly, and more realistically in the near term: where will the line be drawn beyond which machines are not reliable or morally appropriate enough to use, and humans must take over? These, along with many more immediate questions, are among the topics USGIF's Machine Learning & Artificial Intelligence Working Group seeks to generate discussion around.


AIOps tools portend automated infrastructure management

#artificialintelligence

Automated infrastructure management took a step forward with the emergence of AIOps monitoring tools that use machine learning to proactively identify infrastructure problems. Orchestration tools are becoming increasingly popular as part of the DevOps process, as they allow admins to focus on more critical tasks rather than the routine steps it takes to move a workflow along. Our experts analyze the top solutions in the market, namely: Automic, Ayehu, BMC Control-M, CA, Cisco, IBM, Micro Focus, Microsoft, ServiceNow, and VMware.
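
As a hedged illustration of the core idea behind such tools (toy code, not any vendor's product), the simplest form of proactive detection learns a metric's recent normal range and flags points that drift outside it:

```python
# Toy illustration of proactive infrastructure monitoring: flag points in
# a metric stream that deviate from their trailing normal range.
import numpy as np

def anomalies(series, window=60, threshold=3.0):
    """Indices where the metric deviates more than `threshold` standard
    deviations from its trailing `window`-sample mean."""
    flagged = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu, sigma = hist.mean(), hist.std() + 1e-9
        if abs(series[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# Synthetic CPU-utilization trace with one spike injected at index 300.
cpu = np.concatenate([np.random.normal(40, 2, 300), [90.0],
                      np.random.normal(40, 2, 99)])
print(anomalies(cpu))  # the injected spike at index 300 is flagged
```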


Machine Learning: Don't Let Weather Fluctuations Take Your Supply Chain by Storm

#artificialintelligence

Weather can cause significant fluctuations in consumer demand, and because of the bullwhip effect, it can produce unnecessarily high fluctuations on the supply side as well. These variations typically turn into costs. Prepare too extensively, and you'll end up breaching the capacity limitations at every level of your supply chain and increasing your fresh-goods spoilage; fail to prepare sufficiently, and you face significant lost sales. What's more, lost sales are not limited to the products that go out of stock: during extreme weather conditions especially, customers are more likely to decide which store to visit based on the availability of a key product, for example bottled water, snow shovels, quality barbecue meat, or candles. So how can retailers optimally prepare for weather-related fluctuations?
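
As a hedged sketch of the underlying idea (invented data and coefficients, not a production supply-chain model), a forecaster can fold weather features directly into a demand model for a weather-sensitive product:

```python
# Hedged sketch with invented data: folding a weather forecast into a
# simple demand model for a weather-sensitive product such as bottled water.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
days = 365
temp = 15 + 10 * np.sin(2 * np.pi * np.arange(days) / 365) + rng.normal(0, 3, days)
storm = (rng.random(days) < 0.03).astype(float)

# Demand: baseline + heat effect above 20 °C + storm-driven stock-up spike.
demand = 100 + 4 * np.maximum(temp - 20, 0) + 80 * storm + rng.normal(0, 5, days)

model = LinearRegression().fit(np.column_stack([temp, storm]), demand)

# Order quantity for tomorrow given a forecast of 28 °C and a storm warning.
print(model.predict([[28.0, 1.0]]))
```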


miRAW: A deep learning approach to predict miRNA targets by analyzing whole miRNA transcripts

#artificialintelligence

MicroRNAs (miRNAs) are small non-coding RNAs that regulate gene expression by binding to partially complementary regions within the 3'UTR of their target genes. Computational methods play an important role in target prediction and assume that the miRNA "seed region" (nucleotides 2 to 8) is required for functional targeting, but typically only identify 80% of known bindings. Recent studies have highlighted a role for the entire miRNA, suggesting that a more flexible methodology is needed. We present a novel approach for miRNA target prediction based on Deep Learning (DL) which, rather than incorporating any prior knowledge (such as seed regions), investigates the entire miRNA and 3'UTR mRNA nucleotides to learn an uninhibited set of feature descriptors related to the targeting process. We collected more than 150,000 experimentally validated Homo sapiens miRNA:gene targets and cross-referenced them with different CLIP-Seq, CLASH, and iPAR-CLIP datasets to obtain 20,000 validated miRNA:gene exact target sites.
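
As an illustrative sketch only, and not the miRAW architecture itself, the core input idea can be shown in a few lines: encode the whole miRNA and candidate site with no seed-region assumption, and let a network learn the feature descriptors. The sequence lengths and layer sizes below are assumed values.

```python
# Illustrative sketch only, not the miRAW architecture: one-hot encode a
# miRNA:site pair over the whole sequences (no seed-region assumption) and
# classify it with a small dense network.
import numpy as np
import tensorflow as tf

BASES = "ACGU"
MIRNA_LEN, SITE_LEN = 30, 40  # assumed maximum lengths in nucleotides

def one_hot(seq, length):
    """One-hot encode an RNA sequence, zero-padded/truncated to `length`."""
    arr = np.zeros((length, 4), dtype=np.float32)
    for i, base in enumerate(seq[:length]):
        if base in BASES:
            arr[i, BASES.index(base)] = 1.0
    return arr

def encode_pair(mirna, site):
    """Concatenate the flattened encodings of both sequences."""
    return np.concatenate([one_hot(mirna, MIRNA_LEN).ravel(),
                           one_hot(site, SITE_LEN).ravel()])

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4 * (MIRNA_LEN + SITE_LEN),)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # P(functional target)
])
model.compile(optimizer="adam", loss="binary_crossentropy")
```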


Stop Doing Fragile Research

@machinelearnbot

Here's a story familiar to anyone who does research in data science or machine learning: (1) you have a brand-new idea for a method to analyze data; (2) you want to test it, so you start by generating a random dataset or finding a dataset online; (3) you apply your method to the data, but the results are unimpressive; so (4) you introduce a hyperparameter into your method and fine-tune it, until (5) the method eventually starts producing gorgeous results. However, in taking these steps, you have developed a fragile method, one that is sensitive to the choice of dataset and customized hyperparameters. Rather than developing a more general and robust method, you have made the problem easier.
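
One antidote, sketched below with placeholder `method` and `make_dataset` callables (hypothetical names, not from the article), is to score a method across many independently generated datasets and report the spread, rather than tuning on one lucky draw:

```python
# Sketch of a fragility check: evaluate a method on many independently
# generated datasets and summarize the distribution of scores.
import numpy as np

def robustness_report(method, make_dataset, n_trials=30):
    """Run `method(X, y) -> score` on `n_trials` fresh datasets."""
    scores = np.array([method(*make_dataset(np.random.default_rng(seed)))
                       for seed in range(n_trials)])
    return scores.mean(), scores.std(), scores.min()

# Demo with a toy method (hypothetical stand-in for your own analysis):
def make_dataset(rng):
    X = rng.normal(size=(200, 3))
    y = X @ np.array([1.0, 0.0, -1.0]) + rng.normal(size=200)
    return X, y

def method(X, y):
    # Score: R^2 of an ordinary least-squares fit.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return 1 - (y - X @ beta).var() / y.var()

print(robustness_report(method, make_dataset))
```

A method whose worst-case score collapses across draws is fragile, however gorgeous its results on the one dataset it was tuned against.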


Google launches TensorFlow Lite for machine learning on mobile devices

@machinelearnbot

TensorFlow Lite, for machine learning on mobile devices, was first announced by Dave Burke, VP of engineering for Android, at Google I/O 2017. TensorFlow Lite is a lightweight version of Google's open source TensorFlow library, which researchers and developers mainly use for machine learning applications. Now the search giant has launched the developer preview of this new machine learning toolkit designed specifically for smartphones and embedded devices, available to both Android and iOS app developers. The platform allows developers to deploy AI on mobile devices, enabling on-device machine learning inference with low latency and a small binary size.
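
As a minimal sketch of what running a converted model looks like from Python, here is the `tf.lite` interpreter API as it exists in current TensorFlow releases; the 2017 developer preview exposed a slightly different module path, and the model file below is hypothetical.

```python
# Minimal sketch of inference with the tf.lite Python interpreter.
# "model.tflite" is a hypothetical converted model file.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input matching the model's expected shape.
shape = input_details[0]["shape"]
interpreter.set_tensor(input_details[0]["index"],
                       np.random.random_sample(shape).astype(np.float32))

interpreter.invoke()
print(interpreter.get_tensor(output_details[0]["index"]))
```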


AI searches for new inspiration

#artificialintelligence

The unpublished work was presented at the Society for Neuroscience's annual meeting in Washington, D.C. It's one example of the different kinds of learning that researchers would like to develop in AI, based on aspects of human intelligence that computers haven't mastered yet. The approach is among a few being tried, but one that some researchers are excited about because, as Hassabis recently wrote, "[The human brain is] the only existing proof that such an intelligence is even possible." "A lot of the machine learning people now are turning back to neuroscience and asking what have we learned about the brain over the last few decades, and how we can translate principles of neuroscience in the brain to make better algorithms," says Saket Navlakha, a computer scientist at the Salk Institute for Biological Studies. Last week, he and his colleagues published a paper suggesting that incorporating a strategy fruit flies use to decide whether to avoid an odor they haven't encountered before can improve a computer's searches for similar images. The big question for all AI approaches: What problem is a particular algorithm best suited to solve, and will it be better than other AI techniques?
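
The fly strategy Navlakha's group described amounts to a locality-sensitive hash: expand the input with a sparse random projection, then keep only the most active units. Here is a hedged sketch; the dimensions and sparsity are assumed, not the paper's exact settings.

```python
# Hedged sketch of fly-inspired similarity search: sparse random
# projection followed by winner-take-all. Dimensions are assumed.
import numpy as np

rng = np.random.default_rng(0)
d, m, k = 50, 2000, 40  # input dims, expanded dims, active "Kenyon cells"

# Sparse binary projection: each expanded unit samples ~10% of the inputs.
projection = (rng.random((m, d)) < 0.1).astype(np.float32)

def fly_hash(x):
    """Expand the input, then keep only the top-k most active units
    (winner-take-all) as a sparse binary hash code."""
    activity = projection @ x
    code = np.zeros(m, dtype=np.float32)
    code[np.argsort(activity)[-k:]] = 1.0
    return code

# Similar inputs tend to share active units, so the overlap between two
# hash codes approximates their similarity.
a, b = rng.random(d), rng.random(d)
print(np.sum(fly_hash(a) * fly_hash(b)))
```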


Machine learning and big data: a new dimension in online banking security

@machinelearnbot

Furthermore, effective deployment of machine learning and big data can support sophisticated real-time assessment of the risk inherent in every single online transaction. For each online transaction, banks must pose an apparently simple question: are you a trusted customer or a cybercriminal?
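
As a hedged miniature of what such real-time risk scoring can look like (invented features and data, not any bank's system), an unsupervised anomaly detector can score each incoming transaction against historical behavior:

```python
# Hedged sketch with invented features: score incoming transactions
# against an unsupervised model of normal behavior. Real bank systems
# use far richer features; the three here are illustrative only.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Hypothetical historical transactions: [amount, hour_of_day, new_device]
history = np.column_stack([
    rng.lognormal(3.0, 1.0, 10_000),            # transaction amounts
    rng.integers(0, 24, 10_000).astype(float),  # hour of day
    (rng.random(10_000) < 0.05).astype(float),  # new-device flag
])

model = IsolationForest(contamination=0.01, random_state=0).fit(history)

incoming = np.array([[5000.0, 3.0, 1.0]])  # large amount, 3 a.m., new device
risk = -model.score_samples(incoming)      # higher = more anomalous
print(f"risk score: {risk[0]:.3f}")
```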