transfer learning


Big Tech & Their Favourite Deep Learning Techniques

#artificialintelligence

Interestingly, each of the big tech companies seems to have picked a particular school of thought in deep learning, and with time this pattern is becoming clearer. For instance, Facebook AI Research (FAIR) has championed self-supervised learning (SSL) for quite some time, releasing relevant papers and technology for computer vision, image, text, video, and audio understanding. Even though many companies and research institutions have their hands in every possible area of deep learning, each has its favourites. In this article, we will explore some of their recent work in their respective niche or popularised areas.


How does Transfer Learning work?

#artificialintelligence

The simple idea behind transfer learning is this: after a neural network has learned one task, apply that knowledge to another, related task. It is a powerful idea in deep learning. As you know, computer vision and natural language processing tasks require high computational cost and a lot of time. Transfer learning lets us simplify those tasks. For example, after training a model on images to classify cars, we can reuse that model to recognise other vehicles, such as trucks.
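The article does not include code, so here is a minimal, framework-free sketch of the idea in NumPy: a "pretrained" feature extractor stays frozen while only a small new classification head is trained on the related task. The network and data here are toy stand-ins invented for illustration, not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a network pretrained on the first task (e.g. cars):
# its weights are frozen and reused as a feature extractor.
W_frozen = rng.normal(size=(20, 8))

def extract_features(x):
    """Frozen pretrained layers: never updated on the new task."""
    return np.tanh(x @ W_frozen)

# Toy dataset for the new, related task (e.g. trucks).
X = rng.normal(size=(200, 20))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Only the new head is trained: logistic regression on frozen features.
w_head = np.zeros(8)
b_head = 0.0
feats = extract_features(X)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(feats @ w_head + b_head)))
    grad = p - y  # gradient of the log-loss w.r.t. the logits
    w_head -= 0.1 * feats.T @ grad / len(y)
    b_head -= 0.1 * grad.mean()

preds = 1.0 / (1.0 + np.exp(-(feats @ w_head + b_head))) > 0.5
acc = (preds == y).mean()
print(f"head-only training accuracy: {acc:.2f}")
```

Because only the small head is optimised, training is cheap; this is exactly the saving in computation and time the article points to.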


How to create a real-time Face Detector

#artificialintelligence

In this article, I will show you how to write a real-time face detector using Python, TensorFlow/Keras, and OpenCV. All the code is available in this repo. You can also read this tutorial directly on GitLab, where the Python code is highlighted and therefore more convenient to read. First, in the Theoretical Part, I will briefly cover the concepts that will be useful to us (transfer learning and data augmentation), and then I will move on to the code analysis in the Practical Part. Note that you must have the tensorflow and opencv libraries installed to run this code.


Chapter 3: Transfer Learning with ResNet50 -- from Dataloaders to Training

#artificialintelligence

I was given X-ray baggage-scan images by an airport to develop a model that automatically detects dangerous objects (guns and knives). Given only a small number of X-ray images, I am using domain adaptation: first collecting a large number of normal (non-X-ray) images of dangerous objects from the internet, training a model on those normal images alone, and then adapting the model to perform well on X-ray images. In my previous post, I talked about the iterative data-collection process for web images of guns and knives to be used for domain adaptation. In this post, I will discuss transfer learning with ResNet50 using the scraped web images. For now, we won't worry about the X-ray images and will focus only on training the model with the web images. To read this post, it is recommended that you have some knowledge of how to apply transfer learning in PyTorch using a model pre-trained on ImageNet. I won't explain every step in detail, but I will share some useful tips that answer questions like:


Hot dog or Not Hot dog

#artificialintelligence

Have you watched HBO's comedy series "Silicon Valley"? If so, I bet you remember the Not Hotdog app that Jian Yang developed. Here is a clip to refresh your memory. So basically, this app identifies whether something is a hot dog or not. We can train it on other types of objects to identify them as well.


Hot papers on arXiv from the past month: August 2021

AIHub

Reproduced under a CC BY 4.0 license. Here are the most tweeted papers that were uploaded to arXiv during August 2021. Results are powered by Arxiv Sanity Preserver.

How to avoid machine learning pitfalls: a guide for academic researchers
Michael A. Lones
Submitted to arXiv on: 5 August 2021
Abstract: This document gives a concise outline of some of the common mistakes that occur when using machine learning techniques, and what can be done to avoid them. It is intended primarily as a guide for research students, and focuses on issues that are of particular concern within academic research, such as the need to do rigorous comparisons and reach valid conclusions.


What is Transfer Learning? -- Idiot Developer

#artificialintelligence

Transfer learning is a technique in machine learning where we reuse a pre-trained model to solve a different but related problem. It is one of the most popular methods for training deep neural networks, and it is generally used for image-classification tasks where the dataset is small. In this article, we will go through what transfer learning is, how it works, and the advantages it offers. We will also cover the most common problems related to it.
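As a rough sketch of the pattern the article describes (reusing a pre-trained model for a small image-classification dataset), here is what the feature-extraction variant typically looks like in Keras. The base model, input size, and class count are illustrative choices, not taken from the article; `weights=None` avoids downloading a checkpoint, whereas in practice you would pass `weights="imagenet"`.

```python
from tensorflow import keras

NUM_CLASSES = 5  # illustrative

# Pre-trained convolutional base, used as a frozen feature extractor.
# In practice: weights="imagenet"; weights=None here avoids a download.
base = keras.applications.MobileNetV2(
    input_shape=(160, 160, 3), include_top=False, weights=None
)
base.trainable = False  # feature extraction: the backbone stays frozen

# New classification head, trained on the small dataset.
model = keras.Sequential([
    base,
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()
```

Because the backbone is frozen, only the small dense head is learned, which is why this works even when the dataset is small.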


Image Captioning and Tagging Using Deep Learning Models

#artificialintelligence

Technologies that use artificial intelligence to turn the pixels of an image into words are no longer as immature as they were five or more years ago. This technology could help blind people discover the world around them. This article covers use cases of image-captioning technology, its basic structure, and its advantages and disadvantages. We also deploy a model capable of creating a meaningful description of what is displayed in an input image. As a vision-language task, image captioning can be solved with a combination of computer vision and NLP.


Exploring Deep Learning for Image Captioning and Tagging

#artificialintelligence

Technologies that use artificial intelligence to turn the pixels of an image into words are no longer as immature as they were five or more years ago. This technology could help blind people discover the world around them. This article covers use cases of image-captioning technology, its basic structure, and its advantages and disadvantages. We also deploy a model capable of creating a meaningful description of what is displayed in an input image. As a vision-language task, image captioning can be solved with a combination of computer vision and NLP.


EENLP: Cross-lingual Eastern European NLP Index

arXiv.org Artificial Intelligence

This report presents the results of the EENLP project, carried out as part of the EEML 2021 summer school. It presents a broad index of NLP resources for Eastern European languages, which we hope will be helpful for the NLP community; several new hand-crafted cross-lingual datasets focused on Eastern European languages; and a sketch evaluation of the cross-lingual transfer-learning abilities of several modern multilingual Transformer-based models.