Statistical Learning


Introduction to Machine Learning in R

#artificialintelligence

This course material is aimed at people who are already familiar with ... What you'll learn: This course is about the fundamental concepts of machine learning, focusing on neural networks. This topic is getting very hot nowadays because these learning algorithms can be used in several fields, from software engineering to investment banking. Learning algorithms can recognize patterns, which can help detect cancer, for example. We may construct algorithms that can make very good guesses about stock price movements in the market.


Clustering using k-Means with implementation

#artificialintelligence

Clustering is a technique to find natural groups in the data. If we show the above picture to a kid, he can identify that there are four types of animals. He may not know the names of all of them, but he can still identify that there are four different types and he can do this independently, without the help of an adult. As we don't need an adult to supervise, clustering is an unsupervised technique. The three motivations can be listed as follows.
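
To make the grouping idea concrete, here is a minimal k-means sketch in Python using scikit-learn; the toy 2-D data and the choice of four clusters stand in for the "four types of animals" example and are illustrative assumptions, not code from the article.

```python
# Minimal k-means sketch; toy data and k=4 are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

# Toy 2-D points forming four natural groups.
rng = np.random.default_rng(42)
points = np.vstack([
    rng.normal(loc=center, scale=0.3, size=(50, 2))
    for center in [(0, 0), (4, 0), (0, 4), (4, 4)]
])

kmeans = KMeans(n_clusters=4, n_init=10, random_state=42)
labels = kmeans.fit_predict(points)   # cluster index assigned to each point
print(kmeans.cluster_centers_)        # the four learned group centres
```

No labels are supplied anywhere: the algorithm discovers the groups on its own, which is exactly the "no adult supervising" analogy above.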


How to use Deep Learning for Time Series Forecasting

#artificialintelligence

For a long time, I heard that the problem of time series could only be approached by statistical methods (AR[1], MA[2], ARMA[3], ARIMA[4]). These techniques are generally used by mathematicians, who continuously try to improve them to handle both stationary and non-stationary time series. A friend of mine (a mathematician, professor of statistics, and specialist in non-stationary time series) suggested several months ago that I work on validating and improving techniques to reconstruct the light curves of stars. Indeed, the Kepler satellite[11], like many other satellites, could not continuously measure the intensity of the luminous flux of nearby stars. From 2009 to 2016, the Kepler satellite was dedicated to searching for planets outside our Solar System, called extrasolar planets or exoplanets. As you will have understood, we are going to travel a little further than our planet Earth and dive deep into a galactic journey, with machine learning as our vessel.
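
As a flavour of the deep learning alternative, here is a minimal sketch of one-step-ahead forecasting with a small LSTM in Keras; the synthetic light-curve-like signal, window length, and network size are illustrative assumptions, not the pipeline described in the article.

```python
# Minimal deep-learning forecast sketch; signal and model are illustrative.
import numpy as np
import tensorflow as tf

# Synthetic periodic signal standing in for a stellar light curve.
t = np.arange(0, 1000, dtype=np.float32)
series = np.sin(0.02 * t) + 0.1 * np.random.randn(len(t)).astype(np.float32)

# Sliding windows: use the previous 20 points to predict the next one.
window = 20
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., np.newaxis]  # shape (samples, timesteps, features)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

# One-step-ahead forecast from the last observed window.
print(model.predict(series[-window:].reshape(1, window, 1), verbose=0))
```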


8 Clustering Algorithms in Machine Learning that All Data Scientists Should Know

#artificialintelligence

There are three different approaches to machine learning, depending on the data you have. You can go with supervised learning, semi-supervised learning, or unsupervised learning. In supervised learning you have labeled data, so you have outputs that you know for sure are the correct values for your inputs. That's like knowing car prices based on features like make, model, style, drivetrain, and other attributes. With semi-supervised learning, you have a large data set where some of the data is labeled but most of it isn't. This covers a large amount of real world data because it can be expensive to get an expert to label every data point.
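
To illustrate the semi-supervised case, here is a minimal sketch using scikit-learn's SelfTrainingClassifier, where unlabeled points are marked with -1; the toy data and base estimator are illustrative assumptions, not from the article.

```python
# Minimal semi-supervised sketch; toy data and estimator are assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.semi_supervised import SelfTrainingClassifier

X, y = make_classification(n_samples=500, random_state=0)

# Pretend ~90% of the labels are unknown: unlabeled points get the value -1.
y_partial = y.copy()
rng = np.random.default_rng(0)
y_partial[rng.random(len(y)) < 0.9] = -1

model = SelfTrainingClassifier(LogisticRegression())
model.fit(X, y_partial)        # learns from the few labels plus unlabeled points
print(model.score(X, y))       # evaluated against the full labels
```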


Mean Average Precision for Clients

#artificialintelligence

Disclaimer: This project was created for my clients because it's rather challenging to explain such a complex metric simply, so don't expect to see much math or many equations here, and please remember that I try to keep it simple. Accuracy is the most vanilla metric out there. Imagine we are classifying whether there is a dog in a picture. To test our classifier, we prepare a test set with pictures both with and without dogs. We then apply our classifier to every picture and get the predicted classes.
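
For the dog / no-dog example, accuracy is simply the fraction of pictures the classifier gets right. A minimal sketch, with made-up label arrays for illustration:

```python
# Accuracy on a toy dog / no-dog test set; the label arrays are made up.
from sklearn.metrics import accuracy_score

# 1 = "picture contains a dog", 0 = "no dog".
true_labels      = [1, 1, 0, 0, 1, 0, 1, 0]
predicted_labels = [1, 0, 0, 0, 1, 1, 1, 0]

# Fraction of pictures classified correctly: 6 out of 8 here.
print(accuracy_score(true_labels, predicted_labels))  # 0.75
```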


Predictive maintenance and decision support systems in heavy industry

#artificialintelligence

Digital transformation is one of the top priorities for industrial companies. The largest players are already moving in this direction, having worked for many years to continuously improve production efficiency and launch large-scale optimisation programs. These programs go by names such as advanced analytics or digital innovation, but at their core the technology can be summarised as artificial intelligence. In all cases, the efforts to utilise AI models or data analytics systems are part of a bigger digital transformation effort at these companies. In an industrial context, such cost-saving and process-optimisation strategies often start from pilot projects or are guided by top-management directives for digital change. In general, changes in processes or investments in capital-intensive and competitive industries require large sums of money. Traditional capital expenditures usually stretch over a long period, so a company's current financial standing may not allow for a complete physical overhaul of its plants or facilities. These high costs drive the search for cheaper alternatives.


12 Cool Data Science Projects Ideas for Beginners and Experts

#artificialintelligence

Chatbots play a pivotal role for businesses as they can effortlessly handle a barrage of customer queries and messages without any slowdown. They have single-handedly reduced the customer service workload for us by automating a majority of the process. They do this by utilizing techniques backed by Artificial Intelligence, Machine Learning, and Data Science. Chatbots work by analyzing a customer's input and replying with an appropriately mapped response. To train the chatbot, you can use Recurrent Neural Networks with an intents JSON dataset, while the implementation can be handled using Python.
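
As a rough illustration of the intent-to-response mapping, here is a minimal sketch that trains a tiny recurrent classifier over a toy intents structure; the `intents` dictionary, patterns, and responses below are illustrative assumptions standing in for the intents JSON dataset mentioned above.

```python
# Minimal intent-classification chatbot sketch; the intents data is made up.
import numpy as np
import tensorflow as tf

intents = {
    "greeting": {"patterns": ["hi", "hello there", "good morning"],
                 "response": "Hello! How can I help you?"},
    "hours":    {"patterns": ["when are you open", "what are your hours"],
                 "response": "We are open 9am to 5pm, Monday to Friday."},
}

texts, labels, tags = [], [], list(intents)
for i, tag in enumerate(tags):
    for pattern in intents[tag]["patterns"]:
        texts.append(pattern)
        labels.append(i)

# Turn patterns into padded integer sequences, then train a tiny LSTM classifier.
vectorizer = tf.keras.layers.TextVectorization(output_sequence_length=6)
vectorizer.adapt(texts)
X = vectorizer(np.array(texts))
y = np.array(labels)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(6,)),
    tf.keras.layers.Embedding(len(vectorizer.get_vocabulary()), 16),
    tf.keras.layers.LSTM(16),
    tf.keras.layers.Dense(len(tags), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(X, y, epochs=200, verbose=0)

def reply(message: str) -> str:
    """Map a customer message to the response of its most likely intent."""
    probs = model.predict(vectorizer(np.array([message])), verbose=0)[0]
    return intents[tags[int(np.argmax(probs))]]["response"]

print(reply("hello"))
```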


Image Segmentation Using Python Libraries and K-Means

#artificialintelligence

We are all well aware of the capabilities of Python libraries in machine learning, but you can also manipulate image properties with the help of the matplotlib library. We will see how we can easily transform and manipulate image properties. We have imported the required libraries; cv2 is used to read and manipulate the image. Make sure that the image you want to manipulate lies in the same folder. As we know, an image is generally a 3-dimensional array, which is clear from the output shape we got: (350, 525, 3).
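
A minimal sketch of the read-reshape-cluster steps, assuming a file named "image.jpg" in the working folder and three colour clusters (both are illustrative assumptions, not the article's exact code):

```python
# Minimal k-means image segmentation sketch; file name and k=3 are assumed.
import cv2
import numpy as np
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans

img = cv2.cvtColor(cv2.imread("image.jpg"), cv2.COLOR_BGR2RGB)
print(img.shape)  # e.g. (350, 525, 3): height, width, colour channels

# Flatten the image to a list of RGB pixels and cluster them into 3 colours.
pixels = img.reshape(-1, 3)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(pixels)
segmented = (kmeans.cluster_centers_[kmeans.labels_]
             .reshape(img.shape)
             .astype(np.uint8))

plt.imshow(segmented)
plt.axis("off")
plt.show()
```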


Tool Review: Can FeatureTools simplify the process of Feature Engineering?

#artificialintelligence

Feature Engineering is a crucial step in many machine learning projects, but it can be difficult and time-consuming if you aren't already deeply familiar with the data and/or domain. So when I came across the FeatureTools framework, which promises to make Feature Engineering faster and easier, I was excited to try it out. FeatureTools allows you to set up entities and relationships in your data and can then automatically generate tens to hundreds of new features for you. I jumped in to try FeatureTools on the Ames Housing data set, which seemed ideal for Feature Engineering. However, I was getting some strange results, so I decided to back off and try it out on the much simpler Titanic data set.
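
For context on what "entities and relationships" look like in code, here is a minimal sketch with made-up customer/order tables (the table names, columns, and primitives are illustrative assumptions, not the article's Ames or Titanic experiments; the API shown is the featuretools >= 1.0 style):

```python
# Minimal FeatureTools sketch on made-up customer/order tables.
import pandas as pd
import featuretools as ft

customers = pd.DataFrame({"customer_id": [1, 2],
                          "joined": pd.to_datetime(["2020-01-01", "2020-06-01"])})
orders = pd.DataFrame({"order_id": [10, 11, 12],
                       "customer_id": [1, 1, 2],
                       "amount": [20.0, 35.0, 15.0]})

es = ft.EntitySet(id="shop")
es = es.add_dataframe(dataframe_name="customers", dataframe=customers,
                      index="customer_id", time_index="joined")
es = es.add_dataframe(dataframe_name="orders", dataframe=orders, index="order_id")
es = es.add_relationship("customers", "customer_id", "orders", "customer_id")

# Deep Feature Synthesis generates aggregates such as SUM(orders.amount) per customer.
feature_matrix, feature_defs = ft.dfs(
    entityset=es,
    target_dataframe_name="customers",
    agg_primitives=["sum", "mean", "count"],
)
print(feature_matrix.head())
```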


PostgreSQL and Machine Learning

#artificialintelligence

I will show you how to apply Machine Learning algorithms to data from a PostgreSQL database to get insights and predictions. I will use supervised Automated Machine Learning (AutoML), available as an open-source Python package. Thanks to AutoML, I will get quick access to many ML algorithms: Decision Tree, Logistic Regression, Random Forest, Xgboost, and Neural Network. The AutoML will handle feature engineering as well.
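
A minimal sketch of the database-to-model flow: pull a table into pandas over a PostgreSQL connection, then hand it to AutoML. The connection string, table, and target column are illustrative assumptions, and the AutoML class shown is from the open-source mljar-supervised package, which matches the description above but is an assumption on my part.

```python
# Minimal PostgreSQL -> pandas -> AutoML sketch; connection details,
# table, target column, and the mljar-supervised package are assumptions.
import pandas as pd
from sqlalchemy import create_engine
from supervised.automl import AutoML  # pip install mljar-supervised

engine = create_engine("postgresql://user:password@localhost:5432/mydb")
df = pd.read_sql("SELECT * FROM customers", engine)

X = df.drop(columns=["churned"])  # features (assumed target name)
y = df["churned"]                 # target column

automl = AutoML(mode="Explain")   # tries Decision Tree, Random Forest, Xgboost, ...
automl.fit(X, y)
print(automl.predict(X)[:5])
```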