Out-and-Out in Artificial Neural Networks with Keras

#artificialintelligence

When I started reading articles on neural networks, I struggled to understand the basics behind them and how they work. I kept reading more and more articles on the internet, collected the key points, and put them together into private notes for myself. Then I thought I would publish them so that others could understand the basics too. It is fun to learn the fundamentals of any domain. The perceptron is one of the simplest ANN architectures, invented in 1957 by Frank Rosenblatt.
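Because the perceptron is so simple, its learning rule can be sketched in a few lines of plain Python (rather than Keras, purely to illustrate the idea) and trained on the linearly separable AND function:

```python
# Minimal sketch of Rosenblatt's perceptron: a single neuron with a
# step activation, trained with the perceptron learning rule.
def step(x):
    return 1 if x >= 0 else 0

def train_perceptron(samples, labels, lr=1, epochs=20):
    # weights[0] is the bias term
    weights = [0.0] * (len(samples[0]) + 1)
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = step(weights[0] + sum(w * xi for w, xi in zip(weights[1:], x)))
            err = y - pred
            weights[0] += lr * err
            for i, xi in enumerate(x):
                weights[i + 1] += lr * err * xi
    return weights

def predict(weights, x):
    return step(weights[0] + sum(w * xi for w, xi in zip(weights[1:], x)))

# Learn the linearly separable AND function
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 0, 1]
w = train_perceptron(X, y)
print([predict(w, x) for x in X])  # [0, 0, 0, 1]
```

For a linearly separable problem like AND, the perceptron convergence theorem guarantees the rule settles on a separating boundary after a finite number of mistakes.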


Communication Algorithm-Architecture Co-Design for Distributed Deep Learning

#artificialintelligence

Abstract--Large-scale distributed deep learning training has enabled developments of more complex deep neural network models to learn from larger datasets for sophisticated tasks. In particular, distributed stochastic gradient descent intensively invokes all-reduce operations for gradient update, which dominates communication time during iterative training epochs. In this work, we identify the inefficiency in widely used all-reduce algorithms, and the opportunity of algorithm-architecture co-design. We propose the MULTITREE all-reduce algorithm with topology and resource utilization awareness for efficient and scalable all-reduce operations, which is applicable to different interconnect topologies. Moreover, we co-design the network interface to schedule and coordinate the all-reduce messages for contention-free communications, working in synergy with the algorithm. The flow control is also simplified to exploit the bulk data transfer of big gradient exchange. We evaluate the co-design using different all-reduce data sizes for synthetic study, demonstrating its effectiveness on various interconnection network topologies, in addition to state-of-the-art deep neural networks for real workload experiments. The results show that MULTITREE achieves 2.3x and 1.56x communication speedup, as well as up to 81% and 30% training time reduction compared to ring all-reduce and state-of-the-art approaches, respectively.
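For reference, the ring all-reduce baseline that MULTITREE is compared against can be sketched as a single-process simulation. The chunk indices and two-phase structure below follow the standard reduce-scatter/all-gather formulation; this is an illustration, not the paper's code:

```python
# Simulated ring all-reduce over n workers: each worker's gradient is
# split into n chunks; after 2*(n-1) steps every worker holds the sum.
def ring_allreduce(grads):
    n = len(grads)                     # number of workers (and chunks)
    chunks = [list(g) for g in grads]  # chunks[w][c]: worker w's copy of chunk c
    # Reduce-scatter: at step s, worker w sends chunk (w - s) % n to its
    # right neighbor. Values are snapshotted first so the "simultaneous"
    # sends of one step do not double-add.
    for s in range(n - 1):
        sends = [(w, (w - s) % n) for w in range(n)]
        vals = [chunks[w][c] for w, c in sends]
        for (w, c), v in zip(sends, vals):
            chunks[(w + 1) % n][c] += v
    # Now worker w holds the full sum of chunk (w + 1) % n.
    # All-gather: circulate the reduced chunks until everyone has all of them.
    for s in range(n - 1):
        sends = [(w, (w + 1 - s) % n) for w in range(n)]
        vals = [chunks[w][c] for w, c in sends]
        for (w, c), v in zip(sends, vals):
            chunks[(w + 1) % n][c] = v
    return chunks

# Three workers, each holding a 3-element gradient
print(ring_allreduce([[1, 2, 3], [4, 5, 6], [7, 8, 9]]))
# every worker ends with [12, 15, 18]
```

Each worker sends and receives only one chunk per step, which is why the ring is bandwidth-optimal but latency-bound: its 2*(n-1) serialized steps are exactly the cost that tree-based schemes like MULTITREE attack.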


Machine Learning & Deep Learning in Python & R

#artificialintelligence

In this section we will learn what machine learning means and the different terms associated with it. You will see some examples so that you understand what machine learning actually is. It also covers the steps involved in building a machine learning model, not just linear models, but any machine learning model.


Lasso (l1) and Ridge (l2) Regularization Techniques

#artificialintelligence

What is the need for Ridge and Lasso Regression? When we fit a linear model with the best-fitted line and then move to the testing phase, increased variance can leave the model over-fitted, so it will not generalize well in the future or provide appropriate accuracy. To reduce this overfitting, ridge and lasso regression came into the picture. Both are powerful techniques, differing only slightly, for creating models that are efficient and computationally fit while reducing over-fitting. Regularization works by adding a penalty on the size of the coefficients to the loss function, which constrains the model and helps prevent over-fitting.
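To see the penalty at work, here is a minimal gradient-descent fit of y = w*x with and without an L2 (ridge) penalty, on toy data and with no particular library assumed, showing how the penalty shrinks the coefficient:

```python
# Fit y = w*x by gradient descent on mean squared error, optionally
# adding an L2 (ridge) or L1 (lasso) penalty on the weight.
def fit(xs, ys, penalty="none", alpha=1.0, lr=0.01, steps=2000):
    w = 0.0
    n = len(xs)
    for _ in range(steps):
        # gradient of the mean squared error term
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n
        if penalty == "l2":    # ridge: + alpha * w^2  -> gradient 2*alpha*w
            grad += 2 * alpha * w
        elif penalty == "l1":  # lasso: + alpha * |w|  -> gradient alpha*sign(w)
            grad += alpha * (1 if w > 0 else -1 if w < 0 else 0)
        w -= lr * grad
    return w

xs = [1, 2, 3, 4]
ys = [2, 4, 6, 8]                            # true relationship y = 2x
w_plain = fit(xs, ys)                        # converges to 2.0
w_ridge = fit(xs, ys, penalty="l2", alpha=5.0)
print(w_plain, w_ridge)                      # ridge weight is pulled below 2
```

With alpha=5.0 the ridge optimum is 1.2 rather than 2.0: the penalty deliberately trades a little training-set fit for a smaller, more stable coefficient, which is exactly the overfitting-reduction mechanism described above.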


How to Implement Deep Neural Networks for Radar Image Classification

#artificialintelligence

Radar-based recognition and localization of people and things in the home environment has certain advantages over computer vision, including increased user privacy, low power consumption, zero-light operation, and more flexible sensor placement. Shallow machine learning techniques such as Support Vector Machines and Logistic Regression can be used to classify images from radar, and in my previous work, Teaching Radar to Understand the Home and Using Stochastic Gradient Descent to Train Linear Classifiers, I shared how to apply some of these methods. In this article, you will learn how to develop Deep Neural Networks (DNN) and train them to classify objects in radar images. In addition, you will learn how to use a Semi-Supervised Generative Adversarial Network (SGAN) [1] that needs only a small amount of labeled data to train a DNN classifier. This is important when dealing with radar data sets because of the dearth of large training sets, in contrast to those available for camera-based images (e.g., ImageNet), which have helped make computer vision ubiquitous.


Under the Hood of Modern Machine and Deep Learning

#artificialintelligence

In this chapter, we investigate whether unique, optimal decision boundaries can be found. In order to do so, we first have to revisit several fundamental mathematical principles. Regularization is a mathematical tool which allows us to find unique solutions even for highly ill-posed problems. In order to use this trick, we review norms and how they can be used to steer regression problems. Rosenblatt's Perceptron and Multi-Layer Perceptrons, which are also called Artificial Neural Networks, inherently suffer from this ill-posedness.
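As a trivial sketch of the norms the chapter reviews, the L1 and L2 norms of a coefficient vector, the quantities that lasso and ridge regularization penalize, can be computed as:

```python
# L1 norm: sum of absolute values (penalized by lasso)
def l1_norm(v):
    return sum(abs(x) for x in v)

# L2 norm: Euclidean length (its square is penalized by ridge)
def l2_norm(v):
    return sum(x * x for x in v) ** 0.5

v = [3, -4]
print(l1_norm(v))  # 7
print(l2_norm(v))  # 5.0
```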


10 Steps to Master Machine Learning with Python

#artificialintelligence

Machine learning is one of the most popular buzzwords right now, and it has grown in popularity over the years. However, there is a scarcity of qualified machine learning professionals on the market, so now is an excellent time to begin your career in this area. This article provides a step-by-step guide to getting started with machine learning in Python, since Python is regarded as the most common programming language for machine learning. Python is a high-level, object-oriented programming language that was first introduced in 1991, and it is very readable and powerful.


Use Machine Learning and GridDB to build a Production-Ready Stock Market Anomaly Detector

#artificialintelligence

In this project, we use GridDB to create a machine learning platform where Kafka imports stock market data from Alphavantage, a market data provider. Tensorflow and Keras train a model that is then stored in GridDB, and LSTM prediction is finally used to find anomalies in daily intraday trading history. The last piece is visualization and alerting: the data is displayed in Grafana, and GridDB is configured to send notifications via its REST Trigger function to Twilio's Sendgrid. The actual machine learning portion of this project was inspired by posts on Towards Data Science and Curiously. The model and data flow are also applicable to many other datasets, such as predictive maintenance and machine failure prediction, or wherever you want to find anomalies in time series data.
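The anomaly-flagging step at the end of that pipeline can be sketched as follows. A trained model predicts each point from its recent history, and points whose prediction error exceeds a threshold are flagged. Note that a moving-average predictor stands in for the trained LSTM here, purely for illustration:

```python
# Flag points where the prediction error exceeds a threshold.
# (Moving average used as a stand-in predictor for the LSTM.)
def flag_anomalies(series, window=3, threshold=15.0):
    anomalies = []
    for i in range(window, len(series)):
        pred = sum(series[i - window:i]) / window  # predict from recent history
        if abs(series[i] - pred) > threshold:
            anomalies.append(i)
    return anomalies

prices = [100, 101, 100, 102, 101, 130, 102, 101]
print(flag_anomalies(prices))  # [5] -- the spike at index 5
```

In the real pipeline the prediction would come from the stored LSTM model, and a flagged index would fire the GridDB REST Trigger rather than just being printed.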


Day 42 - Forecasting a Time Series and Recurrent Neural Network(RNNs) - IT Consultant - SAP, Artificial Intelligence and Machine Learning

#artificialintelligence

One of the most challenging tasks is forecasting a time series. Whether you agree or not, it is pretty hard to make time-based predictions. Suppose you are working on a project and your task is to predict delivery time using dates from past customer deliveries. If delivery happens through one truck only, the delivery date forms a univariate time series. If delivery depends on multiple modes, truck, ship, and airplane, it is a multivariate time series.
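Before a univariate series like the truck-only delivery dates can be fed to an RNN, it is typically windowed into (input, target) pairs, where each window of past values is used to predict the next one. A small sketch with hypothetical delivery times in days:

```python
# Turn a univariate series into (input window, next value) pairs,
# the shape an RNN is trained on for one-step-ahead forecasting.
def make_windows(series, window=3):
    pairs = []
    for i in range(len(series) - window):
        pairs.append((series[i:i + window], series[i + window]))
    return pairs

deliveries = [2, 3, 4, 3, 5, 4]   # hypothetical delivery times in days
print(make_windows(deliveries))
# [([2, 3, 4], 3), ([3, 4, 3], 5), ([4, 3, 5], 4)]
```

For the multivariate case, each element of the series would itself be a vector (e.g., one value per transport mode), but the windowing idea is the same.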