Deep Learning


Deep Learning, Knowledge Representation and Reasoning

Journal of Artificial Intelligence Research

The recent success of deep neural networks at tasks such as language modelling, computer vision, and speech recognition has attracted considerable interest from industry and academia. Achieving a better understanding and widespread use of such models involves the use of Knowledge Representation and Reasoning together with sound Machine Learning methodologies and systems. The goal of this special track, which closed in 2017, was to serve as a home for the publication of leading research in deep learning towards cognitive tasks, focusing on applications of neural computation to advanced AI tasks requiring knowledge representation and reasoning.


Artificial Intelligence A-Z: Learn How To Build An AI

#artificialintelligence

Combine the power of Data Science, Machine Learning and Deep Learning to create powerful AI for real-world applications! Related courses include Deep Learning A-Z: Hands-On Artificial Neural Networks; Deep Learning and Computer Vision A-Z: OpenCV, SSD & GANs; Artificial Intelligence for Business; and ZERO to GOD Python 3.8 FULL STACK MASTERCLASS (45 AI projects).


TensorFlow deepens its advantages in the AI modeling wars

#artificialintelligence

TensorFlow remains the dominant AI modeling framework. Most AI (artificial intelligence) developers continue to use it as their primary open source tool, or alongside PyTorch, to develop most of their ML (machine learning), deep learning, and NLP (natural language processing) models. In the most recent O'Reilly survey on AI adoption in the enterprise, more than half of the responding data scientists cited TensorFlow as their primary tool. This finding is making me rethink my speculation, published just last month, that TensorFlow's dominance among working data scientists may be waning. Nevertheless, PyTorch remains a strong second choice, having expanded its usage in the O'Reilly study to more than 36 percent of respondents, up from 29 percent in the previous year's survey.


How To Crack Google TensorFlow Developer Certificate Exam

#artificialintelligence

Google Brain recently launched the TensorFlow Developer Certificate program, which enables machine learning (ML) enthusiasts to demonstrate their skills in using TensorFlow to solve deep learning and ML problems. According to the blog post, the goal of this certificate is to give them the opportunity to showcase their expertise in ML in an increasingly AI-driven job market. TensorFlow is one of the most popular open-source ML libraries, providing the essential tools ML researchers and developers need to build state-of-the-art ML applications. The developers at Google Brain say this is intended as a foundational certificate for students, developers, and data scientists who want to demonstrate practical ML skills through building and training models with TensorFlow. Currently, it is a level-one certificate exam that tests a developer's foundational knowledge of integrating ML into tools and applications.


r/MachineLearning - [P] Mimicry: PyTorch library for reproducibility in GAN research.

#artificialintelligence

Hi everyone, I've recently built Mimicry, a PyTorch library for GANs which I hope can make GAN research findings more reproducible. The general idea is to provide an easily accessible set of implementations (that reproduce the original scores as closely as possible), baseline scores for comparison, and GAN metrics that researchers can quickly use to produce results and compare against. For reproducibility, I re-implemented the original models and verified their correctness by checking their scores against the reported ones under the same training and evaluation conditions. On the metrics side, to ensure backward compatibility with existing scores, I adopted the original TensorFlow implementations of Inception Score, FID, and KID, so new scores can be compared with other works directly. I've also included a tutorial on implementing a more sophisticated GAN, such as the Self-Supervised GAN (SSGAN), from the ground up, again with a focus on reproducing the results.
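A precondition for the kind of score comparison described above is deterministic seeding, so that two runs of the same configuration produce identical results. A minimal plain-Python sketch of the principle (illustrative only, not Mimicry's actual API) might look like:

```python
import random

def train_run(seed, steps=5):
    """Toy stand-in for a GAN training run: returns a 'score' that
    depends only on the seed, so runs are exactly reproducible."""
    rng = random.Random(seed)  # isolated RNG, no shared global state
    score = 0.0
    for _ in range(steps):
        score += rng.random()  # stand-in for one training step
    return score

# Two runs with the same seed produce bit-identical results,
# which is the precondition for comparing reported scores.
a = train_run(seed=42)
b = train_run(seed=42)
print(a == b)  # True
```

In a real PyTorch setup the same idea extends to seeding torch, numpy, and Python's random module together, plus fixing the data-loading order.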


Deploying a Deep Learning Model using Flask

#artificialintelligence

I am creating the web deployment for a book I am writing for Manning Publications on deep learning with structured data. The audience for this book is interested in how to deploy a simple deep learning model. They need a deployment example that is straightforward and doesn't force them to wade through a bunch of web programming details. For this reason, I wanted a web deployment solution that kept as much of the coding as possible in Python. With this in mind, I looked at two Python-based options for web deployment: Flask and Django.
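A minimal Flask deployment along these lines (the route name and the placeholder model here are illustrative assumptions, not the book's code) can stay entirely in Python:

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

def predict(features):
    """Placeholder for a trained deep learning model's predict call.
    A real deployment would load a saved Keras or PyTorch model here;
    this stand-in just sums the inputs."""
    return sum(features)

@app.route("/predict", methods=["POST"])
def predict_endpoint():
    payload = request.get_json(force=True)   # e.g. {"features": [1.0, 2.0]}
    score = predict(payload["features"])
    return jsonify({"prediction": score})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

The whole web layer is about a dozen lines, which is the appeal of Flask over Django for a single-model demo: no project scaffolding, just one file exposing one endpoint.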


Combine LSTM and VAR for Multivariate Time Series Forecasting

#artificialintelligence

In a classical time series forecasting task, the first standard modeling decision is between statistical methods and pure machine learning models, including tree-based algorithms and deep learning techniques. The choice depends strongly on the problem at hand, but in general: statistical techniques are adequate for autoregressive problems, where the future depends only on the past, while machine learning models suit more complex situations where it's also possible to combine varied data sources. In this post, I try to combine the statistical method's ability to learn from experience with the generalization power of deep learning techniques. Our task is a multivariate time series forecasting problem, so we use the multivariate extension of ARIMA, known as VAR (vector autoregression), together with a simple LSTM structure. We don't build an ensemble model; instead, we use VAR's ability to filter and study the history, and pass that benefit to our neural network for predicting the future.
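As a rough sketch of the statistical half of such a pipeline (numpy only; the toy data and coefficients are invented for illustration, and in practice statsmodels' VAR class would do the fitting), a VAR(1) model can be fit by ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multivariate series: 200 steps, 3 interrelated variables.
T, k = 200, 3
A_true = np.array([[0.5, 0.1, 0.0],
                   [0.0, 0.4, 0.2],
                   [0.1, 0.0, 0.3]])
y = np.zeros((T, k))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + 0.1 * rng.standard_normal(k)

# VAR(1): y_t ~ c + A y_{t-1}, fit by ordinary least squares.
X = np.hstack([np.ones((T - 1, 1)), y[:-1]])   # intercept + lagged values
B, *_ = np.linalg.lstsq(X, y[1:], rcond=None)  # B has shape (k+1, k)
c, A = B[0], B[1:].T

# One-step-ahead forecast. The residuals y[1:] - X @ B are what an
# LSTM could then model, capturing what the linear VAR part missed.
forecast = c + A @ y[-1]
print(forecast.shape)  # (3,)
```

The neural half would consume the VAR fitted values or residuals as input features, which is the "filter and study the history" role the post assigns to VAR.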


Where you should drop Deep Learning in favor of Constraint Solvers

#artificialintelligence

Machine Learning and Deep Learning are ongoing buzzwords in the industry. Branding ahead of functionality has led to Deep Learning being overused in many artificial intelligence applications. This post provides a quick introduction to constraint satisfaction, a powerful yet underused approach that can tackle a large number of problems in AI and other areas of computer science, from logistics and scheduling to temporal reasoning and graph problems. Let's consider a concrete and highly topical problem: hospitals must organize quickly to treat ill people.
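As a tiny illustration of the constraint-satisfaction approach (a hypothetical shift-assignment toy, not the post's actual example), a plain backtracking solver needs only a few lines:

```python
# Toy constraint satisfaction: assign doctors to shifts so that every
# shift is covered, no doctor works two shifts, and availability
# constraints are respected. Solved by plain backtracking search.
doctors = ["Ana", "Ben", "Chloe"]
shifts = ["morning", "evening", "night"]
unavailable = {("Ben", "night"), ("Chloe", "morning")}  # hard constraints

def solve(assignment=None):
    assignment = assignment if assignment is not None else {}
    if len(assignment) == len(shifts):
        return assignment                    # all shifts covered
    shift = shifts[len(assignment)]          # next unassigned shift
    for doc in doctors:
        if doc in assignment.values():       # one shift per doctor
            continue
        if (doc, shift) in unavailable:      # availability constraint
            continue
        assignment[shift] = doc
        result = solve(assignment)
        if result:
            return result
        del assignment[shift]                # backtrack and retry
    return None

print(solve())  # {'morning': 'Ana', 'evening': 'Ben', 'night': 'Chloe'}
```

Real constraint solvers add constraint propagation and clever search heuristics on top of this skeleton, which is what lets them scale to full hospital rosters.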


Google is using machine learning to improve the quality of Duo calls

#artificialintelligence

Google has rolled out a new technology called WaveNetEQ to improve audio quality in Duo calls when the service can't maintain a steady connection. It's based on technology from Google's DeepMind division and aims to replace audio jitter with artificial noise that sounds just like human speech, generated using machine learning. If you've ever made a call over the internet, chances are you've experienced audio jitter. It happens when packets of audio data sent as part of the call get lost along the way, or otherwise arrive late or in the wrong order. Google says that 99 percent of Duo calls experience packet loss: 20 percent of those calls lose over 3 percent of their audio, and 10 percent lose over 8 percent.


PyTorch: Deep Learning and Artificial Intelligence

#artificialintelligence

PyTorch: Deep Learning and Artificial Intelligence is a new Udemy course. Artificial Intelligence (AI) continues to grow in popularity and disrupt a wide range of domains, but it is a complex and daunting topic. In this course, you'll get to grips with building deep learning apps and using PyTorch for research and for solving real-world problems. Topics covered include artificial neural networks (ANNs) and deep neural networks (DNNs); predicting stock returns; time series forecasting; building a deep reinforcement learning stock trading bot; GANs (generative adversarial networks); convolutional neural networks (CNNs); recurrent neural networks (RNNs); natural language processing (NLP) with deep learning; demonstrating Moore's Law in code; and transfer learning to create state-of-the-art image classifiers. Welcome to PyTorch: Deep Learning and Artificial Intelligence! Although Google's deep learning library TensorFlow has gained massive popularity over the past few years, PyTorch has become the library of choice for many professionals and researchers around the globe working in deep learning and artificial intelligence. Is it possible that TensorFlow is popular only because Google is popular and markets it effectively?