Machine Learning: AI-Alerts


Visa to Add Supervised Machine Learning to Its Fraud Protection Portfolio

#artificialintelligence

Complex algorithms used in data analytics are called unsupervised machine learning, while image recognition or besting the Go champion relies on supervised machine learning, a technology built on neural networks. The new platform is expected to test algorithms that use an advanced form of AI called deep learning, a technique that has the potential to identify more complex patterns than traditional machine-learning algorithms. "It's a massive breakthrough for us," Mr. Taneja said. Visa currently uses machine-learning algorithms to sift through data and identify anomalies, an effort that prevents billions of dollars in fraudulent transactions annually, Mr. Taneja said. One such Visa fraud-detection system, Advanced Authorization, prevented about $25 billion in fraud in the year ended April 30, according to the company. But the current models have limitations. Researchers must know the signals that might indicate fraud, such as a purchase taking place at an unusual time of day, and write the rules that tell the model what to do when it identifies suspicious activity. Criminal activity sometimes slips by unnoticed because hackers are getting more sophisticated at evading the warning signs that current machine-learning models try to detect. Deep-learning models can identify more complex patterns on their own. For example, if a customer uses his or her card in another country for the first time, deep-learning algorithms will be able to tell, with more accuracy and fewer false positives than traditional machine learning, whether it is a legitimate transaction. The algorithms will be able to take into account previous transactions at airlines and hotels, as long as they were made with Visa cards.
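
To make the rule-versus-learned distinction concrete, here is a minimal, purely illustrative Python sketch (not Visa's system): it contrasts a hand-written fraud rule, the kind of signal researchers must specify in advance, with a small neural-network classifier that learns the pattern from labelled examples. The synthetic transaction features, thresholds and scikit-learn model are all assumptions made for the example.

    # Illustrative sketch only: contrasts a hand-written fraud rule with a small
    # learned classifier on synthetic transaction data. Features, thresholds and
    # the model choice are assumptions, not Visa's actual system.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)

    # Synthetic transactions: [amount_usd, hour_of_day, first_use_abroad]
    n = 5000
    X = np.column_stack([
        rng.exponential(80, n),     # purchase amount
        rng.integers(0, 24, n),     # hour of day
        rng.integers(0, 2, n),      # card used in another country for the first time?
    ])
    # Synthetic ground truth: fraud is a noisy mix of several weak signals.
    fraud_score = 0.004 * X[:, 0] + 0.3 * (X[:, 1] < 5) + 0.8 * X[:, 2]
    y = (fraud_score + rng.normal(0, 0.4, n) > 1.2).astype(int)

    # 1) Rule-based check: an analyst has to know and encode the signal up front.
    rule_flags = (X[:, 1] < 5) & (X[:, 2] == 1)

    # 2) Learned model: a small neural network picks up the interactions itself.
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
    clf.fit(X, y)
    model_flags = clf.predict(X)

    print("rule accuracy :", (rule_flags == y).mean())
    print("model accuracy:", (model_flags == y).mean())

The only point of the toy is that the learned model combines amount, time of day and foreign use on its own, without anyone writing the rule for it; Visa's production systems are, of course, far more elaborate.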


Here's how researchers are making machine learning more efficient and affordable for everyone

#artificialintelligence

The research and development of neural networks is flourishing thanks to recent advances in computational power, the discovery of new algorithms, and an increase in labelled data. Before the current explosion of activity in the space, the practical applications of neural networks were limited. While much of the recent research has allowed for broader application, the heavy computational requirements of machine-learning models still keep them from truly entering the mainstream. Now, emerging algorithms are on the cusp of pushing neural networks into more conventional applications through exponentially increased efficiency. Neural networks are a prominent focal point in the current state of computer science research.


The little bicycle that could, thanks to artificial intelligence

#artificialintelligence

Machine learning technology has advanced quickly in recent years, but most devices share a common pitfall: the amount of time, energy, and human input required to get the skills of these systems up to snuff. When artificial intelligence learns, it often does so through brute force, cycling through countless rounds of trial and error until it converges on the best set of tactics. People, on the other hand, are much better at thinking on their feet, and require much less brainpower to do so. To bridge this processing gap, many independent groups of computer scientists are trying to build computer chips with an internal architecture that mimics that of the human brain. So-called neuromorphic chips are hybrids.
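
As a toy illustration of that brute-force, trial-and-error style of learning (and nothing more: the payoffs, exploration rate and number of rounds below are invented), here is a tiny epsilon-greedy learner in Python that needs thousands of trials before it settles on the best of three actions.

    # Toy trial-and-error learner: epsilon-greedy action selection over three
    # actions with hidden payoff probabilities. All numbers are made up.
    import random

    random.seed(0)
    true_reward = [0.2, 0.5, 0.8]           # hidden payoff probability of each action
    estimates = [0.0, 0.0, 0.0]             # the learner's running estimates
    counts = [0, 0, 0]
    epsilon = 0.1                           # fraction of the time spent exploring

    for step in range(5000):                # thousands of rounds of trial and error
        if random.random() < epsilon:
            action = random.randrange(3)                  # explore at random
        else:
            action = estimates.index(max(estimates))      # exploit current best guess
        reward = 1.0 if random.random() < true_reward[action] else 0.0
        counts[action] += 1
        estimates[action] += (reward - estimates[action]) / counts[action]

    print("estimated payoffs:", [round(e, 2) for e in estimates])

A person told the payoffs once would not need 5,000 tries; closing that kind of efficiency gap is what brain-inspired hardware aims at.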


How Facebook's brain-machine interface measures up

#artificialintelligence

Somewhat unceremoniously, Facebook this week provided an update on its brain-computer interface project, preliminary plans for which it unveiled at its F8 developer conference in 2017. In a paper published in the journal Nature Communications, a team of scientists at the University of California, San Francisco backed by Facebook Reality Labs -- Facebook's Pittsburgh-based division devoted to augmented reality and virtual reality R&D -- described a prototype system capable of reading and decoding study subjects' brain activity while they speak. It's impressive no matter how you slice it: the researchers managed to make out full, spoken words and phrases in real time. Study participants (who were prepping for epilepsy surgery) had a patch of electrodes placed on the surface of their brains, and the researchers used a technique called electrocorticography (ECoG) -- the direct recording of electrical potentials associated with activity from the cerebral cortex -- to derive rich insights. A set of machine learning algorithms equipped with phonological speech models learned to decode specific speech sounds from the data and to distinguish between questions and responses.
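
For a rough sense of what "decoding" means here, the sketch below trains an ordinary classifier to label short windows of synthetic cortical activity as question or response. It is a minimal stand-in for illustration only: the 64-channel feature vectors are made up, and a plain logistic-regression decoder takes the place of the phonological speech models the UCSF team actually used.

    # Minimal stand-in, not the UCSF/Facebook pipeline: label synthetic neural
    # feature vectors as "question" (0) vs "response" (1).
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)

    # Pretend each trial is a 64-channel activity vector averaged over a short window.
    n_trials, n_channels = 400, 64
    class_means = rng.normal(0, 1, (2, n_channels))   # one mean pattern per class
    labels = rng.integers(0, 2, n_trials)
    features = class_means[labels] + rng.normal(0, 2.0, (n_trials, n_channels))

    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.25, random_state=1)

    decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("held-out accuracy:", decoder.score(X_test, y_test))

The real system decodes at the level of speech sounds in real time, a far harder problem than this binary toy.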


Mphasis launches deep learning algorithms on AWS

#artificialintelligence

Indian software solutions provider Mphasis, which specializes in cloud and cognitive services, has launched new deep-learning algorithms. The algorithms, available on the AWS Marketplace for Machine Learning, are on-demand solutions targeting practical enterprise use cases such as influence analytics, insurance claims analysis, payment card fraud, and image analytics for supply chain and logistics. The solutions, available for free trial and download on the AWS Marketplace for Machine Learning website, are intended to help users simplify data experimentation, draw deeper insights from disparate sources across their data estate, and raise productivity and efficiency across a wide variety of use cases. According to the company statement, the algorithms include DeepInsights Card Fraud Analysis, a deep-learning-powered classification solution that extracts insights from highly skewed data, and HyperGraf Auto Claims Prediction, which predicts claim occurrence and claim amounts for policyholders, among others. Dr Jai Ganesh, Senior Vice President and Head of Mphasis NEXT Labs, said: "Our solutions target practical, high-value use cases that can deliver immediate impact and ROI in critical enterprise business processes and operations. And users can deploy them with the speed and security provided by AWS." Mphasis is an advanced consulting partner in the AWS Partner Network (APN) and leverages AWS with customers across its business.
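
The "highly skewed" phrasing points at the classic class-imbalance problem in card-fraud data, where genuine transactions vastly outnumber fraudulent ones. The sketch below is a generic, hedged illustration of one standard way to cope with that, re-weighting the rare class; it has nothing to do with Mphasis's actual DeepInsights implementation, and the data, features and model settings are all invented.

    # Generic class-imbalance sketch on invented data; not the Mphasis product.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import classification_report

    rng = np.random.default_rng(2)

    # Roughly 1% fraud rate: plain accuracy would look great even for a useless model.
    n = 20000
    X = rng.normal(0, 1, (n, 6))
    y = (rng.random(n) < 0.01).astype(int)
    X[y == 1] += 1.5                 # shift fraudulent rows so they are learnable

    # class_weight="balanced" re-weights the rare class instead of letting it drown.
    clf = LogisticRegression(class_weight="balanced", max_iter=1000).fit(X, y)
    print(classification_report(y, clf.predict(X), digits=3))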


Three pitfalls to avoid in machine learning

#artificialintelligence

Image caption: Researchers at TAE Technologies in California and at Google are using machine learning to optimize equipment that produces a high-energy plasma. Credit: Liz Kuball

Machine learning is driving discovery across the sciences. Its powerful pattern finding and prediction tools are helping researchers in all fields -- from finding new ways to make molecules and spotting subtle signals in assays, to improving medical diagnoses and revealing fundamental particles. Yet, machine-learning tools can also turn up fool's gold -- false positives, blind alleys and mistakes. Many of the algorithms are so complicated that it is impossible to inspect all the parameters or to reason about exactly how the inputs have been manipulated. As these algorithms begin to be applied ever more widely, risks of misinterpretations, erroneous conclusions and wasted scientific effort will spiral.


How Robots Are Changing the Way You See a Doctor

#artificialintelligence

The following feature is excerpted from TIME Artificial Intelligence: The Future of Humankind, available at retailers and at the Time Shop and Amazon. Medicine is both art and science. While any doctor will quickly credit her rigorous medical training in the nuts and bolts of how the human body works, she will just as adamantly school you on how virtually all of the decisions she makes--about how to diagnose disease and how best to treat it--are equally the product of some less tangible measures: her experience from previous patients; her cumulative years of watching and learning from patients, colleagues and the human body. Which is why the idea of introducing machines into medicine seems misguided at the very least, and also foolhardy. How can a robot, no matter how well-trained, take the place of a doctor?


Deep Learning Places New Demands on Data Center Architectures

#artificialintelligence

Machine and deep learning applications bring new workflows and challenges to enterprise data center architectures. One of the key challenges revolves around data and the storage solutions needed to store, manage, and deliver data that measures up to AI's demands. Today's intelligent applications require infrastructure that is very different from that of traditional analytics workloads, and an organization's data architecture decisions will have a big impact on the success of its AI projects. These are among the key takeaways from a new white paper by the research firm Moor Insights & Strategy. "While discussions of machine learning and deep learning naturally gravitate towards compute, it's clear that these solutions force new ways of thinking about data," the firm notes in its "Enterprise Machine & Deep Learning with Intelligent Storage" paper.


Doctor Alexa Will See You Now: Is Amazon Primed To Come To Your Rescue?

#artificialintelligence

Now that it's upending the way you play music, cook, shop, hear the news and check the weather, the friendly voice emanating from your Amazon Alexa-enabled smart speaker is poised to wriggle its way into all things health care. Amazon has big ambitions for its devices. It thinks Alexa, the virtual assistant inside them, could help doctors diagnose mental illness, autism, concussions and Parkinson's disease. It even hopes Alexa will detect when you're having a heart attack. At present, Alexa can perform a handful of health care-related tasks: "She" can track blood glucose levels, describe symptoms, access post-surgical care instructions, monitor home prescription deliveries and make same-day appointments at the nearest urgent care center.


What is facial recognition - and how sinister is it?

The Guardian

Facial recognition technology has spread prodigiously. Google, Microsoft, Apple and others have built it into apps to compile albums of people who hang out together. It verifies who you are at airports and is the latest biometric used to unlock your mobile phone, where facial recognition apps abound. Need to confirm your identity for a £1,000 bank transfer? Just look into the camera.