Neural Networks


Addition and Subtraction Using Recurrent Neural Networks

#artificialintelligence

How does Google know how to translate '今日はどうですか?' into 'How are you doing today?' or vice versa? How can we predict the spread of a disease such as COVID-19 well into the future? How do automatic text generation and text summarization systems work? The answer is Recurrent Neural Networks. RNNs have been the go-to solution for many problems not only in natural language processing but also in bioinformatics, financial forecasting, sequence modelling, and more.
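To make the idea concrete, here is a minimal sketch (not the article's code) of a character-level sequence-to-sequence RNN that learns to add two integers written as text, in the spirit of the classic Keras addition example; the layer sizes, dataset size, and training settings are illustrative only.

```python
import numpy as np
import tensorflow as tf

chars = "0123456789+ "
char_to_idx = {c: i for i, c in enumerate(chars)}
max_in, max_out = 7, 4  # "999+999" -> 7 input chars, sums up to "1998" -> 4 output chars

def encode(s, length):
    """One-hot encode a string, right-padded with spaces."""
    x = np.zeros((length, len(chars)), dtype=np.float32)
    for i, c in enumerate(s.ljust(length)):
        x[i, char_to_idx[c]] = 1.0
    return x

# Generate a small synthetic dataset of addition problems.
rng = np.random.default_rng(0)
a, b = rng.integers(0, 1000, 20000), rng.integers(0, 1000, 20000)
X = np.stack([encode(f"{i}+{j}", max_in) for i, j in zip(a, b)])
Y = np.stack([encode(str(i + j), max_out) for i, j in zip(a, b)])

# Encoder RNN reads the question; the decoder RNN emits one output character per step.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(max_in, len(chars))),
    tf.keras.layers.LSTM(128),
    tf.keras.layers.RepeatVector(max_out),
    tf.keras.layers.LSTM(128, return_sequences=True),
    tf.keras.layers.Dense(len(chars), activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam", metrics=["accuracy"])
model.fit(X, Y, batch_size=128, epochs=10, validation_split=0.1)
```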


Supervised vs Unsupervised & Discriminative vs Generative

#artificialintelligence

Highlights: GANs and classical deep learning methods (classification, object detection) are similar in many respects, but they are also fundamentally different in nature. Reviewing their properties is the topic of this post. Before we proceed further with the GANs series, it will therefore be useful to recap what supervised and unsupervised learning are. In addition, we will explain the difference between discriminative and generative models. Finally, we will introduce latent variables, since they are an important concept in GANs.
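As a rough illustration of the distinction (not taken from the post): a discriminative model learns the conditional p(y|x) directly, while a generative model learns p(x|y) and p(y) and classifies via Bayes' rule. The toy comparison below uses scikit-learn's logistic regression and Gaussian naive Bayes as stand-ins for the two families.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression  # discriminative: models p(y|x)
from sklearn.naive_bayes import GaussianNB            # generative: models p(x|y) and p(y)
from sklearn.model_selection import train_test_split

# Synthetic binary classification data.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

disc = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
gen = GaussianNB().fit(X_tr, y_tr)

print("discriminative accuracy:", disc.score(X_te, y_te))
print("generative accuracy:   ", gen.score(X_te, y_te))
# Unlike the discriminative model, the generative one could also sample new
# feature vectors for a class from the per-class Gaussians it has estimated.
```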


DeepMind Researchers Propose 'ReLICv2': Pushing The Limits of Self-Supervised ResNets

#artificialintelligence

Supervised learning architectures generally require a massive amount of labeled data, and acquiring high-quality labels at that scale can be very costly and time-consuming. The main idea behind self-supervised methods in deep learning is to learn patterns from unlabelled data and then fine-tune the model with a small amount of labeled data. Self-supervised learning with residual networks has recently progressed, but these models still underperform supervised residual networks by a large margin on ImageNet classification benchmarks. This gap has so far prevented the use of self-supervised models in performance-critical scenarios.
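For context, below is a hedged sketch of the generic contrastive (InfoNCE-style) objective that underlies much of this line of work; ReLICv2 itself adds an explicit invariance regularizer and other refinements, so this is only the common core, not the paper's method.

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1):
    """z1, z2: (N, D) embeddings of two augmented views of the same N images."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature           # (N, N) cosine-similarity matrix
    labels = torch.arange(z1.size(0), device=z1.device)
    # Each view should be most similar to its own positive pair (the diagonal).
    return F.cross_entropy(logits, labels)

# Usage sketch: z1, z2 = encoder(augment(x)), encoder(augment(x)); loss = info_nce(z1, z2)
```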


COVID-19 detection in CT and CXR images using deep learning models

#artificialintelligence

Infectious diseases pose a threat to human life and can affect the whole world in a very short time. Coronavirus disease 2019 (COVID-19) is an example of such a disease: a pandemic caused by the coronavirus SARS-CoV-2, which first appeared in December 2019 in Wuhan, China, before spreading around the world on a very large scale. The continued rise in the number of positive COVID-19 cases has disrupted the health care system in many countries and created enormous stress for governing bodies around the world, hence the need for a rapid way to identify cases of the disease. Medical imaging is a widely accepted approach to early detection and diagnosis and includes techniques such as chest X-ray (CXR) and computed tomography (CT) scans.
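A typical deep learning pipeline for this kind of task is transfer learning on a pretrained CNN. The sketch below is illustrative only, not the paper's model, and the dataset directories "cxr_data/train" and "cxr_data/val" are hypothetical placeholders.

```python
import tensorflow as tf

# Hypothetical folders containing labeled chest X-ray images (two classes).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "cxr_data/train", image_size=(224, 224), batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "cxr_data/val", image_size=(224, 224), batch_size=32)

base = tf.keras.applications.ResNet50(include_top=False, weights="imagenet",
                                      input_shape=(224, 224, 3), pooling="avg")
base.trainable = False  # freeze the ImageNet features, train only the new head

inputs = tf.keras.Input(shape=(224, 224, 3))
x = tf.keras.applications.resnet50.preprocess_input(inputs)
x = base(x, training=False)
outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)  # COVID-19 positive vs. negative
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=5)
```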


TinyML is bringing deep learning models to microcontrollers

#artificialintelligence

This article is part of our reviews of AI research papers, a series of posts that explore the latest findings in artificial intelligence. Deep learning models owe their initial success to large servers with abundant memory and clusters of GPUs. The promise of deep learning gave rise to an entire industry of cloud computing services for deep neural networks. Consequently, very large neural networks running on virtually unlimited cloud resources became very popular, especially among wealthy tech companies that can foot the bill. But recent years have also seen a reverse trend: a concerted effort to create machine learning models for edge devices.
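The usual TinyML workflow is to train a small model and then convert and quantize it so that it fits in a microcontroller's few hundred kilobytes of memory. Below is a minimal sketch using TensorFlow Lite; the model architecture and file names are placeholders, not taken from the article.

```python
import tensorflow as tf

# A deliberately tiny model, e.g. for classifying audio spectrograms on-device.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(49, 40, 1)),        # placeholder spectrogram shape
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(4, activation="softmax"),   # small set of target classes
])
# ... model.fit(...) on the target dataset ...

# Convert to a TensorFlow Lite flatbuffer with post-training quantization.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
# The .tflite file can then be embedded as a C array (e.g. via `xxd -i`) and
# executed on-device with TensorFlow Lite for Microcontrollers.
```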


The importance of invariance in AI 🤖

#artificialintelligence

Compared to computers, humans and most other vertebrates (and even some invertebrates) can learn internal representations of things such as objects or concepts unbelievably fast. Instead of requiring millions of labeled data points, a toddler will understand the concept of a chair with only a handful of examples. How? Do most organisms have a large set of hard-coded procedures encoded in their neural circuitry, created and accumulated over time through evolutionary forces? Considering the evidence, this seems very unlikely. We know that organisms do have some hard-coded memories that influence their behaviors and actions, but the number of such procedures is limited.


Insurance 2030: The impact of AI on the future of insurance

#artificialintelligence

Welcome to the future of insurance, as seen through the eyes of Scott, a customer in the year 2030. Upon hopping into the arriving car, Scott decides he wants to drive today and moves the car into "active" mode. Scott's personal assistant maps out a potential route and shares it with his mobility insurer, which immediately responds with an alternate route that has a much lower likelihood of accidents and auto damage as well as the calculated adjustment to his monthly premium. Scott's assistant notifies him that his mobility insurance premium will increase by 4 to 8 percent based on the route he selects and the volume and distribution of other cars on the road. It also alerts him that his life insurance policy, which is now priced on a "pay-as-you-live" basis, will increase by 2 percent for this quarter. The additional amounts are automatically debited from his bank account. When Scott pulls into his destination's parking lot, his car bumps into one of several parking signs.


How to Regulate Artificial Intelligence the Right Way: State of AI and Ethical Issues

#artificialintelligence

It is critical for governments, leaders, and decision makers to develop a firm understanding of the fundamental differences between artificial intelligence, machine learning, and deep learning. Artificial intelligence (AI) refers to computing systems designed to perform tasks usually reserved for human intelligence using logic, if-then rules, and decision trees. AI recognizes patterns in vast amounts of quality data, providing insights, predicting outcomes, and making complex decisions. Machine learning (ML) is a subset of AI that uses advanced statistical techniques to enable computing systems to improve at tasks with experience over time. Voice assistants like Amazon's Alexa and Apple's Siri improve every year thanks to constant use by consumers coupled with the machine learning that takes place in the background.


One-Shot On-Device Learning for Image Classifiers Using Classification-by-Retrieval

#artificialintelligence

Classification-by-retrieval is a simple method for developing a neural network-based classifier that does not require computationally intensive backpropagation training. The technique can be used to create a lightweight mobile model with as little as one image per class, or an on-device model that can classify tens of thousands of categories; for example, mobile models built with classification-by-retrieval can recognize tens of thousands of landmarks. Image recognition is generally divided into two approaches: classification and retrieval. A common approach to object recognition is to construct a neural network classifier and train it with a considerable quantity of training data (often thousands of images or more).
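The general recipe, sketched below under assumptions (this is not Google's actual implementation), is to embed a handful of reference images with a pretrained network and classify a query by nearest-neighbor lookup over those embeddings, so adding a new class requires no backpropagation. The reference images and labels here are random placeholders.

```python
import numpy as np
import tensorflow as tf

# Pretrained backbone used purely as a feature extractor.
embedder = tf.keras.applications.MobileNetV2(include_top=False, pooling="avg",
                                             weights="imagenet")

def embed(images):
    """images: float32 array of shape (N, 224, 224, 3) with values in [0, 255]."""
    x = tf.keras.applications.mobilenet_v2.preprocess_input(images)
    z = embedder.predict(x, verbose=0)
    return z / np.linalg.norm(z, axis=1, keepdims=True)  # L2-normalize embeddings

# Hypothetical few-shot references: one image per class (replace with real data).
reference_images = (np.random.rand(5, 224, 224, 3) * 255).astype("float32")
reference_labels = np.array(["landmark_a", "landmark_b", "landmark_c",
                             "landmark_d", "landmark_e"])

# "Training" amounts to building the retrieval index of reference embeddings.
index_embeddings = embed(reference_images)

def classify(query_images):
    sims = embed(query_images) @ index_embeddings.T    # cosine similarities
    return reference_labels[np.argmax(sims, axis=1)]   # label of the nearest reference
```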


Harnessing noise in optical computing for AI

#artificialintelligence

Artificial intelligence and machine learning are currently affecting our lives in many small but impactful ways. For example, AI and machine learning applications recommend entertainment we might enjoy through streaming services such as Netflix and Spotify. In the near future, it's predicted that these technologies will have an even larger impact on society through activities such as driving fully autonomous vehicles, enabling complex scientific research and facilitating medical discoveries. But the computers used for AI and machine learning demand a lot of energy. Currently, the need for computing power related to these technologies is doubling roughly every three to four months.