Machine Learning


Google is fighting malware on Gmail with machine learning

#artificialintelligence

Email still reigns as one of the most popular vectors for targeting users and their data, mainly in the form of malicious attachments. Widely used email clients such as Outlook, Apple Mail, and Gmail have been good at filtering out emails laced with these types of malware, but Google wants to do even better. In a blog post on Wednesday, Google announced it is expanding its use of AI to improve detection and detailed its process for spotting malicious documents. According to the company, 58 percent of all malware targeting Gmail users is spread this way. The majority of these malicious files, Google says, are Microsoft Office documents.


MIT discovers a powerful antibiotic using machine learning

#artificialintelligence

Massachusetts Institute of Technology (MIT) researchers have discovered a powerful antibiotic compound using a machine-learning algorithm, one that counters many of the world's deadliest bacteria, including some strains that are resistant to all known antibiotics. It prevented infections in two different mouse models, according to MIT's official release. An advanced computer model that can screen more than a hundred million chemical compounds was used to identify potential antibiotics that can kill dangerous bacteria. Speaking about the discovery, James Collins, the Termeer Professor of Medical Engineering and Science at MIT, stated in a press release: "We wanted to develop a platform that would allow us to harness the power of artificial intelligence to usher in a new age of antibiotic drug discovery." He added that the MIT researchers' approach revealed this "amazing" molecule, which is arguably one of the most potent antibiotics ever discovered.


NLP Interview Questions

#artificialintelligence

It's one thing to practice NLP and another to crack interviews. Interviewing for an NLP role is very different from interviewing for a generic data science profile. In just a few years, the questions have changed completely because of transfer learning and new language models. I have personally experienced that NLP interviews are getting tougher over time as the field makes more progress. Earlier, it was all about SGD, Naive Bayes, and LSTMs; now it's more about LAMB, Transformers, and BERT.
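
Since the excerpt singles out transfer learning and BERT as the new interview staples, here is a minimal sketch of loading a pretrained BERT with the Hugging Face transformers library; the model name and example sentence are just illustrative choices, and a PyTorch backend is assumed.

# Minimal sketch of encoding a sentence with a pretrained BERT
# (assumes the transformers library with a PyTorch backend; the model
# name and input sentence are illustrative only).
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("NLP interviews now revolve around transformers.", return_tensors="pt")
outputs = model(**inputs)

# Contextual token embeddings: (batch, sequence_length, hidden_size).
print(outputs.last_hidden_state.shape)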


Predicting how well neural networks will scale

#artificialintelligence

For all the progress researchers have made with machine learning in helping us do things like crunch numbers, drive cars, and detect cancer, we rarely think about how energy-intensive it is to maintain the massive data centers that make such work possible. Indeed, a 2017 study predicted that, by 2025, internet-connected devices would be using 20 percent of the world's electricity. The inefficiency of machine learning is partly a function of how such systems are created. Neural networks are typically developed by generating an initial model, tweaking a few parameters, trying again, and then rinsing and repeating. But this approach means that significant time, energy, and computing resources are spent on a project before anyone knows whether it will actually work.


Deep Learning Market Size 2020 Global Industry Share, Top Players, Opportunities And Forecast To 2026 – Mathematics Market Methods

#artificialintelligence

The Deep Learning Market report studies and analyzes market size (consumption, value, volume, and production) by company, key regions, products, and end user/application, with Deep Learning market breakdown data from 2014 to 2019 and a six-year forecast from 2020 to 2026. Besides this, the Deep Learning industry research report covers worldwide competition among the topmost manufacturers (Amazon Web Services (AWS), Google, IBM, Intel, Micron Technology, Microsoft, Nvidia, Qualcomm, Samsung Electronics, Sensory Inc., Skymind, Xilinx, AMD, General Vision, Graphcore, Mellanox Technologies, Huawei Technologies, Fujitsu, Baidu, Mythic, Adapteva, Inc., Koniku), providing information such as company profiles, gross, gross margin, capacity, product picture and specification, production, price, cost, revenue, and contact information. The report also provides an in-depth analysis of the key factors influencing the growth of the market (growth potential, opportunities, drivers, industry-specific challenges, and risks). The latest Deep Learning industry data included in this report: Deep Learning market size and analysis (2014–2026); Deep Learning market volume and future trends (2014–2026); Deep Learning market by geography (volume and value), 2014–2026; Deep Learning market opportunity assessment (2014–2026); Deep Learning (installed base) market share by company; major deals in the Deep Learning market; Deep Learning reimbursement scenario; Deep Learning current applications; Deep Learning competitive analysis by company; key market drivers and inhibitors; major companies analysis. Scope of the Deep Learning market: the deep learning market has been segmented on the basis of offerings, applications, end-user industries, and geographies. In terms of offerings, software holds the largest share of the deep learning market.


Getting Started with AutoKeras

#artificialintelligence

One of the most powerful upcoming concepts, which I wrote about in The State of AI in 2020, is Neural Architecture Search (NAS). There is plenty to know about NAS, but for this tutorial a short summary will do. In short, NAS is essentially a method for taking the limitations of human design out of neural network architectures. To accomplish this, many different architectures are considered in parallel, trained, and evaluated. Following this, each may be adjusted based on a selected search algorithm before another architecture is tried.
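
To make the search loop concrete, here is a minimal sketch of what a NAS run looks like with AutoKeras, assuming the autokeras and tensorflow packages are installed; MNIST is used purely as a stand-in dataset, and max_trials caps how many candidate architectures the search will train and evaluate.

# Minimal sketch of a NAS run with AutoKeras (assumes autokeras and
# tensorflow are installed; MNIST is only an example dataset).
import autokeras as ak
from tensorflow.keras.datasets import mnist

(x_train, y_train), (x_test, y_test) = mnist.load_data()

# max_trials = number of candidate architectures the search will try.
clf = ak.ImageClassifier(max_trials=3, overwrite=True)
clf.fit(x_train, y_train, epochs=5)

# Evaluate the best architecture found by the search.
print(clf.evaluate(x_test, y_test))

# Export the winning architecture as a regular Keras model.
best_model = clf.export_model()
best_model.summary()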


WeightWatcher: Empirical Quality Metrics for Deep Neural Networks

#artificialintelligence

We introduce weightwatcher (ww), a Python tool for computing quality metrics of trained and pretrained deep neural networks. This blog describes how to use the tool in practice; see our most recent paper for even more details. The summary contains the power-law exponent (alpha), as well as several log norm metrics, as explained in our papers and below. Each value represents an empirical quality metric that can be used to gauge the gross effectiveness of a model compared to similar models. We can use these metrics to compare models across a common architecture series, such as the VGG series, the ResNet series, etc. They can be applied to trained models, pretrained models, and even fine-tuned models.
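
A minimal usage sketch follows, assuming the weightwatcher package and a pretrained Keras VGG16 are available; analyze() produces per-layer metrics and get_summary() the aggregate quality metrics described above.

# Minimal sketch of analyzing a pretrained model with weightwatcher
# (assumes the weightwatcher package and Keras applications are installed;
# VGG16 is used here only as an example of a pretrained model).
import weightwatcher as ww
from tensorflow.keras.applications import VGG16

model = VGG16(weights="imagenet")

watcher = ww.WeightWatcher(model=model)
details = watcher.analyze()              # per-layer metrics (power-law alpha, log norms, ...)
summary = watcher.get_summary(details)   # aggregate quality metrics for the whole model

print(summary)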


Hydra -- A fresh look at configuration for machine learning projects

#artificialintelligence

Your code is more complicated than you think. One of the first things every software developer learns about is the command line. At its core, the command line is a list of strings that are typically broken down into flags (e.g., --verbose) and arguments (e.g., --port 80). This is enough for many simple applications: you can define two or three command-line arguments with a command-line interface (CLI) parsing library, and you are done.
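
For contrast with plain CLI parsing, here is a minimal Hydra sketch, assuming hydra-core 1.x is installed and a conf/config.yaml sits next to the script; the decorated function receives the composed configuration, and any field can be overridden from the command line.

# Minimal Hydra sketch (assumes hydra-core 1.x and that conf/config.yaml
# next to this script contains, for example:
#   server:
#     port: 80
#   verbose: false
# ).
import hydra
from omegaconf import DictConfig

@hydra.main(config_path="conf", config_name="config")
def my_app(cfg: DictConfig) -> None:
    # Hydra composes the config and passes it in; fields are attribute-accessible.
    print(f"port={cfg.server.port} verbose={cfg.verbose}")

if __name__ == "__main__":
    my_app()

# Any field can be overridden from the command line, e.g.:
#   python my_app.py server.port=8080 verbose=true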


RSAC 2020: Lack of Machine Learning Laws Open Doors To Attacks

#artificialintelligence

SAN FRANCISCO – As companies quickly adopt machine learning systems, cybercriminals are close behind, scheming to compromise them. That worries legal experts, who say a lack of laws swings the door open for bad guys to attack these systems. During a panel session at RSA Conference 2020 this week, Cristin Goodwin, assistant general counsel at Microsoft, said the number of machine-learning-related U.S. court cases is a mere 52. She noted most were related to patents, workplace discrimination, and even gerrymandering. Few court cases addressed actual cyberattacks on machine learning systems, demonstrating a dangerous dearth of legal precedent around the technology.


libmolgrid: Graphics Processing Unit Accelerated Molecular Gridding for Deep Learning Applications

#artificialintelligence

We describe libmolgrid, a general-purpose library for representing three-dimensional molecules using multidimensional arrays of voxelized molecular data. It was designed for seamless integration with popular deep learning frameworks and features optimized performance by leveraging graphics processing units (GPUs).
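
To illustrate what "voxelized molecular data" means in practice, here is a conceptual NumPy sketch of gridding atoms onto a 3D volume; this is not libmolgrid's actual API, and the coordinates, grid resolution, and box size are made-up example values.

# Conceptual sketch of voxelizing atomic coordinates onto a 3D grid
# (illustrative only; NOT the libmolgrid API; all numbers are example values).
import numpy as np

def voxelize(coords, resolution=0.5, box_size=24.0):
    """Drop each atom into the nearest voxel of a cubic grid centered at the origin."""
    dim = int(box_size / resolution)              # voxels per axis
    grid = np.zeros((dim, dim, dim), dtype=np.float32)
    for x, y, z in coords:
        # Shift from [-box/2, box/2) into grid index space.
        i, j, k = ((np.array([x, y, z]) + box_size / 2) / resolution).astype(int)
        if 0 <= i < dim and 0 <= j < dim and 0 <= k < dim:
            grid[i, j, k] += 1.0                  # simple occupancy count
    return grid

# Three made-up atom positions (in Angstroms).
atoms = [(0.0, 0.0, 0.0), (1.4, 0.0, 0.0), (-1.2, 0.7, 0.3)]
print(voxelize(atoms).shape)                      # (48, 48, 48)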