Time Series


Linear, Machine Learning and Probabilistic Approaches for Time Series Analysis

@machinelearnbot

In this post, we consider different approaches to time series modeling. Forecasting approaches using linear models, the ARIMA algorithm, and the XGBoost machine learning algorithm are described, and results for different model combinations are shown. For probabilistic modeling, approaches using copulas and Bayesian inference are considered. Time series analysis, especially forecasting, is an important problem in modern predictive analytics.
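
As a rough, hedged illustration of the kind of model combination described above, the sketch below averages a one-step ARIMA forecast with an XGBoost forecast trained on lagged features; the synthetic series, lag count, and equal weights are illustrative choices, not the post's actual setup.

```python
# Hedged sketch: combine an ARIMA forecast with an XGBoost forecast built on
# lagged features by simple averaging. Series, lags, and weights are illustrative.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from xgboost import XGBRegressor

def lagged_frame(values, n_lags=7):
    """Build a supervised-learning table of lagged values."""
    df = pd.DataFrame({"y": values})
    for k in range(1, n_lags + 1):
        df[f"lag_{k}"] = df["y"].shift(k)
    return df.dropna()

y = pd.Series(np.sin(np.arange(300) / 10.0) + np.random.normal(0, 0.1, 300))

# ARIMA one-step-ahead forecast
arima_fc = ARIMA(y, order=(2, 0, 1)).fit().forecast(steps=1).iloc[0]

# XGBoost forecast from the most recent lagged values
table = lagged_frame(y.values)
model = XGBRegressor(n_estimators=200, max_depth=3, learning_rate=0.05)
model.fit(table.drop(columns="y"), table["y"])
next_lags = pd.DataFrame([y.values[-1:-8:-1]],
                         columns=[f"lag_{k}" for k in range(1, 8)])
xgb_fc = float(model.predict(next_lags)[0])

# Naive combination: equal-weight average of the two forecasts
combined = 0.5 * arima_fc + 0.5 * xgb_fc
print(arima_fc, xgb_fc, combined)
```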


Machine Learning And Artificial Intelligence In Demand Planning

#artificialintelligence

While machine learning and artificial intelligence (AI) have been used in supply chain applications for some time, there is an ongoing arms race to leverage both more effectively in demand planning solutions. Demand planning is one of the key applications in supply chain planning (SCP) suites. In ARC's recent global market study, demand planning applications account for just under a third of a market worth more than $2 billion. These applications are often the wedge purchase: the SCP solution a company implements first before going on to purchase other solutions in the suite. Machine learning works by taking the output of an application (for example, a forecast), examining that output against some measure of the truth, adjusting the parameters or math used to generate that output, and checking whether the adjustments lead to more accurate outputs.
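
A toy version of that adjust-and-recheck loop, assuming a simple exponential-smoothing forecaster whose smoothing weight is the only tunable parameter, might look like the sketch below; the demand data and parameter grid are purely illustrative.

```python
# Hedged sketch of the loop described above: generate a forecast, score it
# against held-out actuals, and keep the parameter value with the lowest error.
import numpy as np

def ses_forecast(history, alpha):
    """One-step-ahead simple exponential smoothing forecast."""
    level = history[0]
    for obs in history[1:]:
        level = alpha * obs + (1 - alpha) * level
    return level

rng = np.random.default_rng(0)
demand = 100 + np.cumsum(rng.normal(0, 2, 120))   # synthetic demand history
train, test = demand[:100], demand[100:]

best_alpha, best_mae = None, np.inf
for alpha in np.linspace(0.05, 0.95, 19):
    errors = []
    history = list(train)
    for actual in test:
        forecast = ses_forecast(history, alpha)   # produce the output
        errors.append(abs(actual - forecast))     # compare to "the truth"
        history.append(actual)                    # roll the window forward
    mae = float(np.mean(errors))
    if mae < best_mae:                            # keep the better parameter
        best_alpha, best_mae = alpha, mae

print(f"best alpha={best_alpha:.2f}, MAE={best_mae:.2f}")
```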


AIEVE: A lesson to predict the future -- Steemit

#artificialintelligence

The ultimate aim of AI is to produce more efficient and accurate predictions. The current trend in AI practice is to build deep learning models with TensorFlow or Keras. In particular, I have seen a lot of interest and research around predicting time series with Long Short-Term Memory (LSTM) neural networks, a recurrent architecture used in deep learning. I specialize in the analysis of time series data (a series of observations over time) and am particularly experienced in the utilities sector.
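
For readers unfamiliar with the setup, a minimal sketch of such an LSTM forecaster in the Keras API might look as follows; the window length, layer width, and training settings are illustrative and not taken from the article.

```python
# Minimal sketch of an LSTM forecaster (Keras API); window length, layer
# width, and training settings are illustrative.
import numpy as np
from tensorflow import keras

window = 24  # number of past observations fed to the network

model = keras.Sequential([
    keras.Input(shape=(window, 1)),   # (timesteps, features)
    keras.layers.LSTM(32),
    keras.layers.Dense(1),            # next-value prediction
])
model.compile(optimizer="adam", loss="mse")

# Toy data: sliding windows over a noisy sine wave
series = np.sin(np.arange(500) / 20.0) + np.random.normal(0, 0.05, 500)
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
model.fit(X[..., None], y, epochs=2, batch_size=32, verbose=0)
```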


Automated Feature Engineering for Time Series Data

@machinelearnbot

Most machine learning algorithms today are not time-aware and are not easily applied to time series and forecasting problems. Leveraging advanced algorithms like XGBoost, or even linear models, typically requires substantial data preparation and feature engineering – for example, creating lagged features, detrending the target, and detecting periodicity. The preprocessing becomes more difficult in the common case where the problem requires predicting a window of multiple future time points. As a result, most practitioners fall back on classical methods, such as ARIMA or trend analysis, which are time-aware but less expressive. This article covers best practices for solving this challenge by introducing a general framework for developing time series models, generating features, and preprocessing the data, and it explores the potential to automate this process so that advanced machine learning algorithms can be applied to almost any time series problem.
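
A hedged sketch of the feature engineering steps mentioned above (lagged values, a linear detrend, and simple calendar features exposing periodicity) might look like this; the column names and lag choices are illustrative and do not reproduce the article's framework.

```python
# Hedged sketch of typical time series feature engineering: lagged values,
# a least-squares detrend of the target, and calendar features. Illustrative only.
import numpy as np
import pandas as pd

def make_features(series, n_lags=7):
    idx = pd.date_range("2017-01-01", periods=len(series), freq="D")
    df = pd.DataFrame({"y": series}, index=idx)

    # Lagged features: past values the model is allowed to see
    for k in range(1, n_lags + 1):
        df[f"lag_{k}"] = df["y"].shift(k)

    # Detrended target: subtract a least-squares linear trend
    t = np.arange(len(df))
    slope, intercept = np.polyfit(t, df["y"], 1)
    df["y_detrended"] = df["y"] - (slope * t + intercept)

    # Calendar features that expose weekly/annual periodicity to a tree model
    df["dayofweek"] = df.index.dayofweek
    df["month"] = df.index.month
    return df.dropna()

features = make_features(np.random.normal(size=200).cumsum())
print(features.head())
```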


How to Prepare Univariate Time Series Data for Long Short-Term Memory Networks - Machine Learning Mastery

@machinelearnbot

It can be hard to prepare data when you're just getting started with deep learning. Long Short-Term Memory, or LSTM, recurrent neural networks expect three-dimensional input in the Keras Python deep learning library. If you have a long sequence of thousands of observations in your time series data, you must split your time series into samples and then reshape it for your LSTM model. In this tutorial, you will discover exactly how to prepare your univariate time series data for an LSTM model in Python with Keras.
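
A minimal sketch of that split-and-reshape step, matching the [samples, timesteps, features] layout Keras expects, might look as follows; the window length is an illustrative choice.

```python
# Minimal sketch: split a long univariate series into fixed-length samples and
# reshape to the 3D [samples, timesteps, features] layout Keras LSTMs expect.
import numpy as np

series = np.arange(5000, dtype="float32")   # stand-in for thousands of observations
timesteps = 10                              # illustrative window length

# Split into overlapping windows: each sample is `timesteps` inputs plus a target
X = np.stack([series[i:i + timesteps] for i in range(len(series) - timesteps)])
y = series[timesteps:]

# Reshape inputs from [samples, timesteps] to [samples, timesteps, features]
X = X.reshape((X.shape[0], timesteps, 1))
print(X.shape, y.shape)   # (4990, 10, 1) (4990,)
```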


Time Series Analysis with Generalized Additive Models

@machinelearnbot

Whenever you spot a trend plotted against time, you are looking at a time series. The de facto choice for studying financial market performance and weather forecasts, time series analysis is one of the most pervasive techniques because of its inextricable relation to time: we are always interested in foretelling the future. One intuitive way to make forecasts is to refer to recent time points. Today's stock prices are likely to be more similar to yesterday's prices than to those from five years ago. Hence, we would give more weight to recent prices than to older ones in predicting today's price.
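
A hedged sketch of that weighting intuition, using an exponentially decaying weighted average rather than the generalized additive models the article actually builds, might look like this; the prices and decay rate are illustrative.

```python
# Hedged sketch of the intuition above: forecast today's value as a weighted
# average of past values, with larger weights on recent observations.
# Illustrates the weighting idea only, not the article's GAM approach.
import numpy as np

prices = np.array([101.0, 102.5, 101.8, 103.2, 104.0, 103.5])  # oldest -> newest
decay = 0.7                                                     # weight decay per step back

ages = np.arange(len(prices))[::-1]        # newest observation has age 0
weights = decay ** ages
weights /= weights.sum()                   # normalise so weights sum to 1

forecast = float(np.dot(weights, prices))  # recent prices dominate the estimate
print(round(forecast, 2))
```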


A simple deep learning model for stock price prediction using TensorFlow

@machinelearnbot

For a recent hackathon that we did at STATWORX, some of our team members scraped minute-by-minute S&P 500 data from the Google Finance API. The data consisted of the index value as well as the stock prices of the S&P 500's constituents. With this data at hand, the idea of developing a deep learning model that predicts the S&P 500 index from the 500 constituents' prices one minute earlier immediately came to mind. Playing around with the data and building the deep learning model with TensorFlow was fun, so I decided to write my first Medium.com post. What you will read is not an in-depth tutorial, but more a high-level introduction to the important building blocks and concepts of TensorFlow models.
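
One way such a model could be set up, sketched here with the Keras API rather than the lower-level TensorFlow graph code the post walks through, is shown below; the layer sizes and the synthetic stand-in data are illustrative.

```python
# Hedged sketch of a regression from 500 constituent prices to the index value.
# Uses the Keras API instead of the post's lower-level TensorFlow graph code;
# layer sizes and data are illustrative.
import numpy as np
from tensorflow import keras

n_constituents = 500

model = keras.Sequential([
    keras.Input(shape=(n_constituents,)),   # constituent prices one minute earlier
    keras.layers.Dense(256, activation="relu"),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1),                  # predicted index value
])
model.compile(optimizer="adam", loss="mse")

# Toy stand-in for the scraped minute-level data
X = np.random.normal(size=(2000, n_constituents)).astype("float32")
y = X.mean(axis=1) + np.random.normal(0, 0.01, 2000).astype("float32")
model.fit(X, y, epochs=2, batch_size=64, verbose=0)
```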


Time Series Forecasting in Machine Learning – 99XTechnology – Medium

@machinelearnbot

Understanding temporal patterns and characteristics in data is becoming a critical aspect of analyzing and describing trends in business data. Example use case 1: the fitness device market is built around helping people track fitness-related data to monitor the effectiveness of their exercise. Example use case 2: the sales growth of a product over a period of time is a good indicator of a manufacturer's sales performance. A typical time series can exhibit different patterns, so it is important to understand the components of a time series in detail.
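
A short, hedged sketch of inspecting those components (trend, seasonality, and the irregular remainder) with a classical decomposition might look like this; the monthly sales series is synthetic.

```python
# Hedged sketch of separating a series into trend, seasonal, and residual
# components with a classical decomposition; the sales data is synthetic.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

idx = pd.date_range("2016-01-01", periods=36, freq="MS")
sales = pd.Series(
    np.linspace(100, 160, 36)                      # upward trend
    + 10 * np.sin(2 * np.pi * np.arange(36) / 12)  # yearly seasonality
    + np.random.normal(0, 2, 36),                  # irregular component
    index=idx,
)

parts = seasonal_decompose(sales, model="additive", period=12)
print(parts.trend.dropna().head())
print(parts.seasonal.head())
```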


Recurrent Neural Nets – The Third and Least Appreciated Leg of the AI Stool

@machinelearnbot

Summary: Convolutional Neural Nets are getting all the press, but it's Recurrent Neural Nets that are the real workhorse of this generation of AI. We've paid a lot of attention lately to Convolutional Neural Nets (CNNs) as the cornerstone of 2nd-gen NNs and spent some time on Spiking Neural Nets (SNNs) as the most likely path forward to 3rd gen, but we'd really be remiss if we didn't stop to recognize Recurrent Neural Nets (RNNs), because RNNs are solid performers in the 2nd-gen NN world and perform many tasks much better than CNNs. These include speech-to-text, language translation, and even automated captioning for images. By count, there are probably more applications for RNNs than for CNNs.


Trend Analysis of Fragmented Time Series: Hypothesis Testing Based Adaptive Spline Filtering Method

#artificialintelligence

Missing data present significant challenges to trend analysis of time series. Straightforward approaches that fill missing data with constant or zero values, or with linear trends, can severely degrade the quality of the trend analysis and significantly reduce its reliability. We present a robust adaptive approach to discover trends in fragmented time series. The approach proposed in this paper is based on the HASF (Hypothesis-testing-based Adaptive Spline Filtering) trend analysis algorithm, which can accommodate non-uniform sampling and is therefore inherently robust to missing data. HASF adapts the nodes of the spline based on hypothesis testing and variance minimization, which adds to its robustness.
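
As a hedged illustration of why splines suit non-uniform sampling (this is not the HASF algorithm itself), a smoothing spline can be fit directly to irregularly spaced observations, so gaps need no constant or zero filling before a trend is extracted:

```python
# Hedged illustration (not HASF): fit a smoothing spline directly to
# irregularly spaced observations instead of filling the gaps first.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)
t_full = np.linspace(0, 10, 200)
keep = rng.random(200) > 0.4               # drop ~40% of samples at random
t = t_full[keep]                           # irregular, fragmented sampling times
y = np.sin(t) + 0.1 * t + rng.normal(0, 0.1, t.size)

trend = UnivariateSpline(t, y, s=t.size * 0.01)  # smoothing set by residual budget
print(trend(np.array([2.5, 5.0, 7.5])))          # trend estimate inside the gaps
```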