Machine Learning


Testing for a causal effect (with 2 time series)

#artificialintelligence

"… an old variable explains 85% of the change in a new variable. So we can talk about causality." Nevertheless, that was frustrating, and I was wondering whether there was a clever way to test for causality in that case. A popular one is Granger causality (I can mention a paper we published a few years ago in which we use such a test, Tents, Tweets, and Events: The Interplay Between Ongoing Protests and Social Media). With the off-diagonal terms of the matrix \Omega, we have so-called instantaneous causality, and since \Omega is symmetric, we write x\leftrightarrow y. With the off-diagonal terms of the matrix \boldsymbol{A}, we have so-called lagged causality, with either \textcolor{blue}{x\rightarrow y} or \textcolor{red}{x\leftarrow y} (and possibly both, if both terms are significant).
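The lagged-causality test described above can be sketched in a few lines of numpy: regress y on its own lags (restricted model), then add lags of x (unrestricted model), and compare the two fits with an F test. The simulated series, lag order, and coefficients below are invented for illustration; the excerpt's actual analysis uses a fitted VAR model.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 500, 2  # sample size, number of lags

# Simulated example where x Granger-causes y (x -> y), but not the reverse.
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(p, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()

def lagmat(series, p):
    """Matrix whose columns are the series lagged by 1..p."""
    return np.column_stack([series[p - k:n - k] for k in range(1, p + 1)])

def rss(X, target):
    """Residual sum of squares of a least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ beta
    return resid @ resid

target = y[p:]
ones = np.ones((n - p, 1))
restricted = np.hstack([ones, lagmat(y, p)])          # y's own lags only
unrestricted = np.hstack([restricted, lagmat(x, p)])  # plus lags of x

rss_r, rss_u = rss(restricted, target), rss(unrestricted, target)
df = (n - p) - unrestricted.shape[1]
F = ((rss_r - rss_u) / p) / (rss_u / df)
print(f"F statistic for x -> y: {F:.1f}")  # large F rejects "no lagged causality"
```

In practice one would use a packaged implementation (e.g. statsmodels' `grangercausalitytests`) rather than hand-rolling the regressions.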



Artificial intelligence yields new antibiotic

#artificialintelligence

Using a machine-learning algorithm, MIT researchers have identified a powerful new antibiotic compound. In laboratory tests, the drug killed many of the world's most problematic disease-causing bacteria, including some strains that are resistant to all known antibiotics. It also cleared infections in two different mouse models. The computer model, which can screen more than a hundred million chemical compounds in a matter of days, is designed to pick out potential antibiotics that kill bacteria using different mechanisms than those of existing drugs. "We wanted to develop a platform that would allow us to harness the power of artificial intelligence to usher in a new age of antibiotic drug discovery," says James Collins, the Termeer Professor of Medical Engineering and Science in MIT's Institute for Medical Engineering and Science (IMES) and Department of Biological Engineering.


AI Is Used to Discover a Novel Antibiotic

#artificialintelligence

Researchers announced the breakthrough discovery of a new type of antibiotic compound that is capable of killing many types of harmful bacteria, including deadly antibiotic-resistant strains, and published their findings in Cell on February 20. What makes this remarkable is that the researchers, from the Massachusetts Institute of Technology (MIT), Harvard, and McMaster University, used machine learning (a form of artificial intelligence) to discover the new antibiotic--an achievement that heralds the disruption of traditional research and drug development processes deployed by pharmaceutical industry behemoths. Antibiotic resistance is a global threat that is exacerbated by the overuse of antibiotics in livestock, the proliferation of antimicrobials in consumer products, and over-prescription in health care. Though estimating the future impact is challenging, one report predicted that by 2050, 10 million deaths per year could result from antimicrobial-resistant (AMR) infections. Combating the problem of antimicrobial resistance requires bringing novel compounds to market.


Microsoft Injects New AI Features Into Dynamics 365

#artificialintelligence

Microsoft on Wednesday unveiled several new artificial intelligence capabilities across Dynamics 365 applications and a new solution to help project-centric services organizations transform their operations. The AI enhancements include first- and third-party data connections in Dynamics 365 Customer Insights, Microsoft's customer data platform (CDP). "The work in AI and CDP is new and a key part of Microsoft taking their products to an AI-driven approach," noted Ray Wang, principal analyst at Constellation Research. The company also unveiled new manual and predictive forecasting capabilities for Dynamics 365 Sales and Dynamics 365 Sales Insights. "Integration with the CDP is important, but more important will be the ability to automate transactions and apply AI to drive the next best action," Wang told CRM Buyer.


A human-machine collaboration to defend against cyberattacks

#artificialintelligence

Being a cybersecurity analyst at a large company today is a bit like looking for a needle in a haystack -- if that haystack were hurtling toward you at fiber optic speed. Every day, employees and customers generate loads of data that establish a normal set of behaviors. An attacker will also generate data while using any number of techniques to infiltrate the system; the goal is to find that "needle" and stop it before it does any damage. The data-heavy nature of that task lends itself well to the number-crunching prowess of machine learning, and an influx of AI-powered systems has indeed flooded the cybersecurity market over the years. But such systems can come with their own problems, namely a never-ending stream of false positives that can make them more of a time suck than a time saver for security analysts.


Microsoft DoWhy is an Open Source Framework for Causal Reasoning

#artificialintelligence

The human mind has a remarkable ability to associate causes with a specific event. From the outcome of an election to an object dropping on the floor, we are constantly associating chains of events that cause a specific effect. Neuropsychology refers to this cognitive ability as causal reasoning. Computer science and economics study a specific form of causal reasoning known as causal inference, which focuses on exploring relationships between two observed variables. Over the years, machine learning has produced many methods for causal inference, but most remain difficult to use in mainstream applications.
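To make the causal-inference problem concrete, here is a minimal numpy sketch of backdoor adjustment for a confounder. It illustrates the idea only; it does not use DoWhy's API, and the data-generating coefficients are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Invented data-generating process: z confounds both treatment and outcome.
z = rng.normal(size=n)                        # confounder
t = 1.0 * z + rng.normal(size=n)              # treatment, influenced by z
y = 2.0 * t + 3.0 * z + rng.normal(size=n)    # true causal effect of t on y is 2.0

def ols_slopes(columns, target):
    """Least-squares fit with an intercept; returns the slope coefficients."""
    X = np.column_stack([np.ones(n)] + columns)
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return beta[1:]

naive = ols_slopes([t], y)[0]        # biased: absorbs z's effect through t
adjusted = ols_slopes([t, z], y)[0]  # backdoor adjustment: condition on z
print(f"naive estimate: {naive:.2f}, adjusted estimate: {adjusted:.2f}")
```

Frameworks like DoWhy automate this kind of adjustment once the user specifies the causal graph (here z → t, z → y, t → y) alongside the data.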


Arm announces new processor IPs for AI and machine learning - KitGuru

#artificialintelligence

Arm has announced details of its latest processors designed for artificial intelligence and machine learning: the Arm Cortex-M55, as well as its first microNPU (Neural Processing Unit), the Ethos-U55, which together offer up to a 480x machine learning performance improvement for microcontrollers. Cortex-M based processors already power a vast range of AI products, with over 50 billion chips shipped to partners. Arm claims that the Cortex-M55 is its most capable AI processor yet and the company's first Cortex-M processor based on the Armv8.1-M architecture. Equipped with Arm Helium vector processing technology, the Cortex-M55 offers significantly enhanced energy efficiency, a 5x digital signal processing (DSP) performance improvement, and 15x machine learning (ML) performance compared to previous Cortex-M generations. In addition, custom instructions can be added to improve processor performance for specific workloads, a new feature for Cortex-M series processors.


r/MachineLearning - [R] Bayesian Deep Learning and a Probabilistic Perspective of Generalization

#artificialintelligence

Abstract: The key distinguishing property of a Bayesian approach is marginalization, rather than using a single setting of weights. Bayesian marginalization can particularly improve the accuracy and calibration of modern deep neural networks, which are typically underspecified by the data, and can represent many compelling but different solutions. We show that deep ensembles provide an effective mechanism for approximate Bayesian marginalization, and propose a related approach that further improves the predictive distribution by marginalizing within basins of attraction, without significant overhead. We also investigate the prior over functions implied by a vague distribution over neural network weights, explaining the generalization properties of such models from a probabilistic perspective. From this perspective, we explain results that have been presented as mysterious and distinct to neural network generalization, such as the ability to fit images with random labels, and show that these results can be reproduced with Gaussian processes.
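The abstract's central move, marginalizing over solutions rather than picking one, can be sketched in a few lines: a deep ensemble approximates the Bayesian model average by averaging the members' predictive distributions. The logits below are invented stand-ins for trained networks that have landed in different basins of attraction.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def entropy(p):
    return float(-(p * np.log(p)).sum())

# Toy stand-in for K ensemble members: each is confident, but they
# disagree on the same input (loosely, different basins of attraction).
member_logits = np.array([
    [4.0, 0.0, 0.0],   # member 1: confident in class 0
    [0.0, 4.0, 0.0],   # member 2: confident in class 1
    [4.0, 0.0, 0.0],   # member 3: confident in class 0
])
member_probs = softmax(member_logits)

# Approximate Bayesian marginalization: average the predictive
# distributions p(y | x, w_k) over members, not their weights or logits.
ensemble = member_probs.mean(axis=0)

# The averaged distribution is less overconfident than any single member.
print("member entropies:", [round(entropy(p), 3) for p in member_probs])
print("ensemble entropy:", round(entropy(ensemble), 3))
```

The higher entropy of the ensemble's predictive distribution is the calibration benefit the abstract refers to: disagreement between solutions becomes honest uncertainty.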


Heart arrhythmia detection using Deep Learning

#artificialintelligence

A common problem that Deep Learning is helping to solve lately is time series classification. The classic approach to this kind of problem is to generate features from the available signals and train a machine learning model on them, but handcrafting those features can take a great chunk of a project's schedule. Deep learning architectures have proven effective at reducing the amount of time spent on feature engineering. In this article, we're going to train a couple of models to detect irregular heart rhythms.
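As a sketch of the "classic approach" the paragraph contrasts with deep learning, here is a handcrafted-feature pipeline on synthetic signals. The signals, the two features, and the nearest-centroid classifier are all invented for illustration; real work would use ECG recordings and a learned model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-in for short heartbeat windows: "normal" rhythm at a steady
# frequency, "irregular" rhythm at a randomly shifted frequency.
def make_window(freq):
    t = np.linspace(0, 1, 200)
    return np.sin(2 * np.pi * freq * t) + 0.1 * rng.normal(size=t.size)

normal = [make_window(5.0) for _ in range(50)]
irregular = [make_window(5.0 + rng.uniform(2, 4)) for _ in range(50)]

# Classic approach, step 1: handcraft features from each signal.
def features(sig):
    spectrum = np.abs(np.fft.rfft(sig))
    dominant_bin = spectrum[1:].argmax() + 1  # dominant frequency, skipping DC
    return np.array([sig.std(), dominant_bin])

X = np.array([features(s) for s in normal + irregular])
y = np.array([0] * 50 + [1] * 50)

# Step 2: train a simple model; a nearest-centroid classifier stands in
# for the machine-learning model here.
cent0, cent1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
pred = (np.linalg.norm(X - cent1, axis=1)
        < np.linalg.norm(X - cent0, axis=1)).astype(int)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

The deep learning alternative the article pursues replaces the hand-written `features` step with layers that learn their own representations from the raw signal.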