Statistical Learning


Dialysis adequacy predictions using a machine learning method - Scientific Reports

#artificialintelligence

Dialysis adequacy is an important survival indicator in patients with chronic hemodialysis. However, there are inconveniences and disadvantages to measuring dialysis adequacy by blood samples. This study used machine learning models to predict dialysis adequacy in chronic hemodialysis patients using repeatedly measured data during hemodialysis. This study included 1333 hemodialysis sessions corresponding to the monthly examination dates of 61 patients. Patient demographics and clinical parameters were continuously measured from the hemodialysis machine; 240 measurements were collected from each hemodialysis session. Machine learning models (random forest and extreme gradient boosting [XGBoost]) and deep learning models (convolutional neural network and gated recurrent unit) were compared with multivariable linear regression models. The mean absolute percentage error (MAPE), root mean square error (RMSE), and Spearman’s rank correlation coefficient (Corr) for each model using fivefold cross-validation were calculated as performance measurements. The XGBoost model had the best performance among all methods (MAPE = 2.500; RMSE = 2.906; Corr = 0.873). The deep learning models with convolutional neural network (MAPE = 2.835; RMSE = 3.125; Corr = 0.833) and gated recurrent unit (MAPE = 2.974; RMSE = 3.230; Corr = 0.824) had similar performances. The linear regression models had the lowest performance (MAPE = 3.284; RMSE = 3.586; Corr = 0.770) compared with other models. Machine learning methods can accurately infer hemodialysis adequacy using continuously measured data from hemodialysis machines.
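The paper's code is not reproduced here, but the comparison it describes, tree ensembles against a linear baseline evaluated with fivefold cross-validation on MAPE, RMSE, and Spearman's correlation, can be sketched roughly as follows. The feature matrix X and adequacy target y are placeholders, not the study's data.

```python
# Rough sketch of the evaluation setup described above (not the authors' code).
# X, y stand in for the per-session features and the measured dialysis adequacy.
import numpy as np
from scipy.stats import spearmanr
from sklearn.model_selection import KFold
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_percentage_error, mean_squared_error
from xgboost import XGBRegressor

def evaluate(model, X, y, n_splits=5):
    """Fivefold cross-validation reporting MAPE (%), RMSE, and Spearman's Corr."""
    mape, rmse, corr = [], [], []
    for train_idx, test_idx in KFold(n_splits=n_splits, shuffle=True, random_state=0).split(X):
        model.fit(X[train_idx], y[train_idx])
        pred = model.predict(X[test_idx])
        mape.append(mean_absolute_percentage_error(y[test_idx], pred) * 100)
        rmse.append(np.sqrt(mean_squared_error(y[test_idx], pred)))
        corr.append(spearmanr(y[test_idx], pred).correlation)
    return np.mean(mape), np.mean(rmse), np.mean(corr)

for name, model in [("linear", LinearRegression()),
                    ("xgboost", XGBRegressor(n_estimators=300, max_depth=4))]:
    print(name, evaluate(model, X, y))
```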


Linear Regression in Python

#artificialintelligence

Forecasting, in general, means predicting future trends using previous or historical data as inputs to obtain an efficient and effective estimate. Forecasting models use different methods for different situations, and evaluation procedures are also conducted. Forecasting evaluation is carried out step by step, starting with testing assumptions, testing data and methods, replicating outputs, and assessing outputs. There are three basic types of forecasting: qualitative techniques, time series analysis and projection, and causal models. In this course you will be introduced to Linear Regression in Python, Importing Libraries, Graphical Univariate Analysis, Boxplot, Linear Regression Boxplot, Linear Regression Outliers, Bivariate Analysis, Machine Learning Base Run and Predicting Output.
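The course materials themselves are not shown, but a minimal linear regression in Python with scikit-learn, on illustrative data rather than the course dataset, looks like this.

```python
# Minimal linear regression sketch in Python (illustrative data, not the course dataset).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))              # single predictor
y = 3.0 * X[:, 0] + 5.0 + rng.normal(0, 1, 200)    # linear signal plus noise

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_train, y_train)
print("coefficient:", model.coef_, "intercept:", model.intercept_)
print("R^2 on held-out data:", model.score(X_test, y_test))
```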


House Price Forecasting using Zillow Economics dataset

#artificialintelligence

In the previous blog, we discussed a predictive model for house prices using machine learning algorithms. In this blog, we discuss time series forecasting on Zillow economics data using a statistical modeling approach. The project was implemented in September 2019, and house prices were forecast for the following year, 2020. The code can be reused by changing the forecasting year or the forecasting horizon. The results discussed in this blog are for the year 2020.
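The post does not name the specific statistical model here; one common choice for this kind of one-year horizon is ARIMA from statsmodels, sketched below on a placeholder monthly price series to show how the horizon can be changed.

```python
# Hypothetical sketch: forecasting a monthly house-price series one year ahead with ARIMA.
# The blog's actual model and the Zillow series are not reproduced; the placeholder data
# only illustrates how the forecasting horizon (steps) can be changed to reuse the code.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

dates = pd.date_range("2010-01-01", "2019-09-01", freq="MS")
prices = pd.Series(200000 + np.cumsum(np.random.default_rng(0).normal(500, 300, len(dates))),
                   index=dates)                      # stand-in for a regional price series

fit = ARIMA(prices, order=(1, 1, 1)).fit()
print(fit.forecast(steps=12))                        # the next 12 months, e.g. the year 2020
```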


Monte Carlo Markov Chain (MCMC) explained

#artificialintelligence

MCMC methods are a family of algorithms that use Markov chains to perform Monte Carlo estimation. The name hints that it is composed of two components: Monte Carlo and Markov chains. Let us understand them separately and in their combined form. The Monte Carlo method derives its name from the Monte Carlo Casino in Monaco. It is a technique for sampling from a probability distribution and using those samples to approximate a desired quantity. In other words, it uses randomness to estimate some deterministic quantity of interest.
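As a tiny illustration of the Monte Carlo half of the name, the sketch below estimates the deterministic constant pi purely by averaging random samples; the specific target is my own example, not the article's.

```python
# Plain Monte Carlo sketch: estimate pi by sampling uniformly in the unit square
# and counting how many points fall inside the quarter circle (illustrative example).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
points = rng.uniform(0.0, 1.0, size=(n, 2))
inside = (points ** 2).sum(axis=1) <= 1.0
print("pi estimate:", 4 * inside.mean())   # randomness approximating a deterministic quantity
```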


Advanced K-Means: Controlling Groups Sizes and Selecting Features

#artificialintelligence

The algorithm uses ideas from Linear Programming, in particular Network Models. Network models are used, among other things, in logistics to optimise the flow of goods across a network of roads. We can see in the simple figure above that we have 5 nodes with directed arcs (the arrows) between them. Each node has a demand (negative) or supply (positive) value and the arcs have flow and cost values. For instance, the arc 2–4 has a flow of 4 and a cost of $2. Similarly, node 1 supplies 20 units and node 4 requires 5 units.
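A minimum-cost flow network of this shape can be set up with networkx; the nodes, supplies, and costs below are a made-up balanced example rather than the exact figure from the article, and note that networkx encodes supply as a negative demand.

```python
# Hypothetical min-cost flow network, loosely mirroring the supply/demand idea above.
# networkx convention: demand < 0 means the node supplies units, demand > 0 means it requires them.
import networkx as nx

G = nx.DiGraph()
G.add_node(1, demand=-20)   # node 1 supplies 20 units
G.add_node(4, demand=5)     # node 4 requires 5 units
G.add_node(5, demand=15)    # invented sink so that supply and demand balance
for u, v, cost in [(1, 2, 1), (1, 3, 3), (2, 4, 2), (2, 5, 4), (3, 5, 1)]:
    G.add_edge(u, v, weight=cost, capacity=20)

flow = nx.min_cost_flow(G)                  # flow per arc minimising total cost
print(flow)
print("total cost:", nx.cost_of_flow(G, flow))
```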


Predicting The Wind Speed Using K-Neighbors Classifier

#artificialintelligence

Hope you are all doing well in this hard time of the Covid era. In this article, we are going to predict the wind speed at the current date and time for any given latitude and longitude coordinates. We'll be using a K-neighbors classifier to build our prediction model. The dataset we are using is available on GitHub here. The first step I always suggest is to check the Python version you are using.
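A bare-bones version of such a model with scikit-learn might look like the following; the file name, column names, and the binning of wind speed into classes are my assumptions, since a classifier needs discrete labels.

```python
# Hypothetical sketch of a K-neighbors classifier for wind speed (not the article's exact code).
# Assumes a CSV with latitude, longitude, and wind_speed columns; wind speed is binned into
# categories because KNeighborsClassifier predicts discrete classes.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

df = pd.read_csv("wind_data.csv")                      # placeholder path for the GitHub dataset
df["speed_class"] = pd.cut(df["wind_speed"], bins=[0, 5, 10, 20, 100],
                           labels=["calm", "light", "moderate", "strong"])

X = df[["latitude", "longitude"]]
y = df["speed_class"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print("accuracy:", knn.score(X_test, y_test))
print(knn.predict([[28.6, 77.2]]))                     # predicted class for one lat/lon pair
```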


Evolution Of Natural Language Processing(NLP)

#artificialintelligence

In this article I want to share the evolution of text analysis algorithms over the last decade. Natural language processing (NLP) has been around for a long time; in fact, a very simple bag-of-words model was introduced in the 1950s. But in this article I want to focus on the evolution of NLP during recent times. There has been enormous progress in the field since 2013 due to the advancement of machine learning algorithms together with the reduced cost of computation and memory. In 2013, a research team led by Tomas Mikolov at Google introduced the Word2Vec algorithm.
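Word2Vec is available today through gensim; a toy training run on a few hand-written sentences (my own example, using gensim 4.x parameter names) looks like this.

```python
# Toy Word2Vec example using gensim (gensim 4.x parameter names; illustrative corpus).
from gensim.models import Word2Vec

sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "animals"],
]
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

print(model.wv["cat"][:5])              # first few dimensions of the learned vector
print(model.wv.most_similar("cat"))     # nearest words in the embedding space
```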


Series of Projects on Data Science and Data Analytics in Both Python and R

#artificialintelligence

The best way to learn is to do projects, especially for self-learners who have learned some programming and data science tools and are wondering what to do next. I suggest you start doing some projects of your own. Also, for a beginner who has no industry experience, some practice projects can make a good portfolio. But for a beginner, it may be hard to think of a good project idea.


Dimensionality Reduction using an Autoencoder in Python

#artificialintelligence

Dimensionality is the number of input variables or features in a dataset, and dimensionality reduction is the process through which we reduce the number of input variables in a dataset. A large number of input features makes predictive modeling a more challenging task. When dealing with high-dimensional data, it is often useful to reduce the dimensionality by projecting the data to a lower-dimensional subspace which captures the "essence" of the data. This is called dimensionality reduction. "Dimensionality reduction yields a more compact, more easily interpretable representation of the target concept, focusing the user's attention on the most relevant variables."
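A minimal Keras sketch of the autoencoder idea, assuming a numeric feature matrix and a small bottleneck layer, is shown below; the layer sizes and synthetic data are arbitrary choices, not the article's.

```python
# Minimal autoencoder sketch for dimensionality reduction (assumed layer sizes, synthetic data).
import numpy as np
from tensorflow.keras import layers, Model

X = np.random.default_rng(0).normal(size=(1000, 20)).astype("float32")  # placeholder dataset

inputs = layers.Input(shape=(20,))
bottleneck = layers.Dense(3, activation="relu")(layers.Dense(10, activation="relu")(inputs))
outputs = layers.Dense(20)(layers.Dense(10, activation="relu")(bottleneck))

autoencoder = Model(inputs, outputs)
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(X, X, epochs=20, batch_size=32, verbose=0)   # learn to reconstruct the input

encoder = Model(inputs, bottleneck)            # keep only the compression half
X_reduced = encoder.predict(X)                 # 3-dimensional representation of the data
print(X_reduced.shape)
```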


Top 70+ Data Science Interview Questions and Answers for 2021

#artificialintelligence

We can see the Pr value here, and there are three stars associated with it. The three stars indicate that we can reject the null hypothesis, which states that there is no relationship between the age and target columns; in other words, there is a strong relationship between the age column and the target column. Now, we have other parameters like null deviance and residual deviance.
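The output being discussed comes from R's summary() of a fitted glm; a roughly equivalent check in Python with statsmodels, on placeholder data, surfaces the same quantities (p-values, null deviance, residual deviance), though without R's significance stars.

```python
# Python analogue (statsmodels) of inspecting p-values and deviances for a logistic model;
# the age/target data is synthetic, only the fitted attributes mirror R's glm summary.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
age = rng.uniform(20, 80, 500)
target = (rng.uniform(size=500) < 1 / (1 + np.exp(-0.08 * (age - 50)))).astype(int)

X = sm.add_constant(age)
result = sm.GLM(target, X, family=sm.families.Binomial()).fit()

print(result.pvalues)          # Pr(>|z|); very small values play the role of the *** stars
print(result.null_deviance)    # deviance of the intercept-only model
print(result.deviance)         # residual deviance of the fitted model
```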