Deep Learning with PyTorch (9-Day Mini-Course) - MachineLearningMastery.com
Deep learning is a fascinating field of study, and its techniques are achieving world-class results on a range of challenging machine learning problems. It can be hard to get started in deep learning. Which library should you use, and which techniques should you focus on? In this 9-part crash course, you will discover applied deep learning in Python with the easy-to-use and powerful PyTorch library. This mini-course is intended for practitioners who are already comfortable programming in Python and know the basic concepts of machine learning. This is a long and useful post. You might want to print it out. Photo by Thomas Kinto, some rights reserved.
Using Activation Functions in Deep Learning Models - MachineLearningMastery.com
A deep learning model in its simplest form consists of layers of perceptrons connected in tandem. Without any activation functions, they are just matrix multiplications with limited power, regardless of how many layers you stack. Activation is the magic that lets a neural network approximate a wide variety of non-linear functions. In PyTorch, many activation functions are available for use in your deep learning models. In this post, you will see how the choice of activation function can impact the model.
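As a minimal sketch of this idea, the network below takes the activation function as a parameter; the architecture and layer sizes are illustrative, not from the article:

```python
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    """A hypothetical two-layer network; only the activation changes."""

    def __init__(self, activation):
        super().__init__()
        self.layer1 = nn.Linear(4, 8)
        self.act = activation          # e.g. nn.ReLU(), nn.Tanh(), nn.Sigmoid()
        self.layer2 = nn.Linear(8, 1)

    def forward(self, x):
        # Without self.act this would collapse to a single linear map.
        return self.layer2(self.act(self.layer1(x)))

x = torch.randn(5, 4)
for act in (nn.ReLU(), nn.Tanh(), nn.Sigmoid()):
    model = TinyNet(act)
    y = model(x)                       # shape (5, 1) for every activation
```

Swapping `nn.ReLU()` for `nn.Tanh()` or `nn.Sigmoid()` changes the function class the model can represent without touching the rest of the code.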
Using Dropout Regularization in PyTorch Models - MachineLearningMastery.com
Dropout is a simple and powerful regularization technique for neural networks and deep learning models. In this post, you will discover the Dropout regularization technique and how to apply it to your PyTorch models. Dropout is a regularization technique for neural network models proposed around 2012 to 2014. It is a layer in the neural network. During training, it takes the output from the previous layer, randomly selects some of the neurons, and zeroes them out before passing the result to the next layer, effectively ignoring them. This means that their contribution to the activation of downstream neurons is temporarily removed on the forward pass, and no weight updates are applied to those neurons on the backward pass.
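A small sketch of this behavior with PyTorch's `nn.Dropout` layer (the `p=0.5` rate is illustrative):

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)   # each element is zeroed with probability 0.5

x = torch.ones(10)

drop.train()               # training mode: dropout active; survivors scaled by 1/(1-p)
y_train = drop(x)          # a mix of 0.0 and 2.0 values

drop.eval()                # evaluation mode: dropout is a no-op
y_eval = drop(x)           # identical to x
```

Note the `train()`/`eval()` distinction: PyTorch applies dropout only in training mode, and rescales the surviving activations by 1/(1-p) so the expected output matches inference time.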
Building a Multiclass Classification Model in PyTorch - MachineLearningMastery.com
PyTorch is a library for deep learning. Some applications of deep learning models are to solve regression or classification problems. In this tutorial, you will discover how to use PyTorch to develop and evaluate neural network models for multi-class classification problems. You will use a standard machine learning dataset called the iris flowers dataset. It is a well-studied dataset and good for practicing machine learning.
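A minimal sketch of such a classifier, sized for the iris data (4 input features, 3 classes); the hidden width and the synthetic batch below are illustrative assumptions:

```python
import torch
import torch.nn as nn

# 4 iris features in, 3 class logits out; hidden size is arbitrary.
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Linear(8, 3),                 # raw logits, one per class
)
loss_fn = nn.CrossEntropyLoss()      # expects logits and integer class labels

x = torch.randn(6, 4)                # stand-in batch instead of real iris rows
y = torch.tensor([0, 1, 2, 0, 1, 2])
loss = loss_fn(model(x), y)
loss.backward()                      # gradients now populated for training
```

Note that `CrossEntropyLoss` applies softmax internally, so the final layer outputs raw logits rather than probabilities.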
How to Evaluate the Performance of PyTorch Models - MachineLearningMastery.com
Designing a deep learning model is sometimes an art. There are many decision points, and it is not easy to tell which choice is best. One way to come up with a design is by trial and error, evaluating the result on real data. Therefore, it is important to have a scientific method for evaluating the performance of your neural network and deep learning models. In fact, the same method can be used to compare any kind of machine learning model on a particular task.
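One common form of such a method is a hold-out evaluation: train on one split of the data and score on the other. The sketch below uses synthetic data and an illustrative linear model, not the article's own setup:

```python
import torch
import torch.nn as nn

# Synthetic data: target is a noisy sum of the three features.
X = torch.randn(100, 3)
y = X.sum(dim=1, keepdim=True) + 0.1 * torch.randn(100, 1)
X_train, X_val = X[:80], X[80:]      # 80/20 hold-out split
y_train, y_val = y[:80], y[80:]

model = nn.Linear(3, 1)
loss_fn = nn.MSELoss()
opt = torch.optim.SGD(model.parameters(), lr=0.1)

for _ in range(200):                 # fit on the training split only
    opt.zero_grad()
    loss = loss_fn(model(X_train), y_train)
    loss.backward()
    opt.step()

model.eval()
with torch.no_grad():                # score on data the model never saw
    val_loss = loss_fn(model(X_val), y_val).item()
```

The held-out `val_loss` is the number you would compare across candidate designs; k-fold cross-validation repeats this split several times for a more robust estimate.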
Building a Regression Model in PyTorch - MachineLearningMastery.com
PyTorch is a library for deep learning. Some applications of deep learning models are to solve regression or classification problems. In this post, you will discover how to use PyTorch to develop and evaluate neural network models for regression problems. The dataset you will use in this tutorial is the California housing dataset, which describes the median house value for California districts.
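A sketch of a regression network sized for that dataset (8 input features, 1 continuous output); the hidden width and the stand-in batch are illustrative:

```python
import torch
import torch.nn as nn

# California housing has 8 predictors and one continuous target.
model = nn.Sequential(
    nn.Linear(8, 24),
    nn.ReLU(),
    nn.Linear(24, 1),        # no final activation: output is unbounded
)
loss_fn = nn.MSELoss()       # mean squared error, standard for regression

x = torch.randn(10, 8)       # stand-in batch; real rows would come from the dataset
pred = model(x)              # shape (10, 1)
```

The key difference from a classifier is the single unbounded output and the MSE loss instead of cross-entropy.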
How to Develop a Random Forest Ensemble in Python - MachineLearningMastery.com
The effect is that the predictions, and in turn the prediction errors, made by each tree in the ensemble are more different from one another, i.e. less correlated. When the predictions from these less correlated trees are averaged, the result is often better performance than bagged decision trees. Perhaps the most important hyperparameter to tune for the random forest is the number of random features to consider at each split point: the number of randomly selected predictors, k, to choose from at each split, commonly referred to as mtry. In the regression context, Breiman (2001) recommends setting mtry to one-third of the number of predictors.
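That heuristic maps directly onto scikit-learn's `max_features` parameter; the synthetic dataset and sizes below are illustrative:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

# Synthetic regression data with 9 predictors.
X, y = make_regression(n_samples=200, n_features=9, noise=0.1, random_state=1)

# Breiman's regression heuristic: mtry = one-third of the predictors.
mtry = X.shape[1] // 3               # 9 features -> 3 candidates per split
model = RandomForestRegressor(n_estimators=100, max_features=mtry, random_state=1)
model.fit(X, y)
```

Passing an integer to `max_features` fixes the number of candidate predictors examined at every split, which is exactly the mtry tuning knob described above.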
Feature Selection For Machine Learning in Python - MachineLearningMastery.com
The data features that you use to train your machine learning models have a huge influence on the performance you can achieve. Irrelevant or partially relevant features can negatively impact model performance. In this post, you will discover automatic feature selection techniques that you can use to prepare your machine learning data in Python with scikit-learn. Photo by Baptiste Lafontaine, some rights reserved. Feature selection is a process where you automatically select those features in your data that contribute most to the prediction variable or output in which you are interested.
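One such technique available in scikit-learn is univariate selection with `SelectKBest`; the dataset and the choice of k=4 below are illustrative:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

# 10 features, of which only 3 are informative.
X, y = make_classification(n_samples=100, n_features=10,
                           n_informative=3, random_state=1)

# Keep the 4 features with the highest ANOVA F-scores.
selector = SelectKBest(score_func=f_classif, k=4)
X_selected = selector.fit_transform(X, y)   # shape (100, 4)
```

The fitted selector also exposes `selector.get_support()`, a boolean mask identifying which of the original columns survived.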
Making Linear Predictions in PyTorch - MachineLearningMastery.com
Linear regression is a statistical technique for estimating the relationship between two variables. A simple example of linear regression is to predict the height of someone based on the square root of the person’s weight (that’s what BMI is based on). To do this, we need to find the slope and intercept of the line. […]
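The slope-and-intercept prediction can be sketched in PyTorch as follows; the parameter values here are arbitrary placeholders, not fitted coefficients:

```python
import torch

# Illustrative slope (w) and intercept (b) for y = w * x + b.
w = torch.tensor(2.0)
b = torch.tensor(-1.0)

def forward(x):
    """Linear prediction: slope times input plus intercept."""
    return w * x + b

x = torch.tensor([1.0, 2.0, 3.0])
y = forward(x)       # tensor([1., 3., 5.])
```

In a real model, `w` and `b` would be created with `requires_grad=True` and fitted to data by gradient descent rather than set by hand.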
Inferencing the Transformer Model - MachineLearningMastery.com
We have seen how to train the Transformer model on a dataset of English and German sentence pairs, and how to plot the training and validation loss curves to diagnose the model's learning performance and decide at which epoch to run inference. We are now ready to run inference on the trained Transformer model to translate an input sentence. In this tutorial, you will discover how to run inference on the trained Transformer model for neural machine translation. Photo by Karsten Würth, some rights reserved. Recall that the Transformer architecture follows an encoder-decoder structure.