SEA: A Combined Model for Heat Demand Prediction

arXiv.org Machine Learning

Heat demand prediction is a prominent research topic in the area of intelligent energy networks. It is well recognized that periodicity is one of the important characteristics of heat demand. The seasonal-trend decomposition based on LOESS (STL) algorithm can analyze the periodicity of a heat demand series and decompose it into seasonal and trend components. Heat demand can then be predicted by forecasting the seasonal and trend components separately and combining their predictions. In this paper, STL-ENN-ARIMA (SEA), a combined model, is proposed that couples the Elman neural network (ENN) with the autoregressive integrated moving average (ARIMA) model, both of which are commonly applied to heat demand prediction. ENN and ARIMA are used to predict the seasonal and trend components, respectively. Experimental results demonstrate that the proposed SEA model has promising performance.
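
The paper's own code is not reproduced here, but the decompose-forecast-recombine idea behind SEA can be sketched roughly as follows. The sketch assumes an hourly demand series with a daily cycle, uses statsmodels STL and ARIMA, and substitutes a seasonal-naive repeat of the last cycle where the paper uses an Elman network.

```python
# Minimal sketch of the STL decompose -> forecast -> recombine idea.
# Assumptions: hourly heat demand with a daily cycle (period=24); the paper's
# Elman network is replaced here by a seasonal-naive repeat of the last cycle.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL
from statsmodels.tsa.arima.model import ARIMA

idx = pd.date_range("2021-01-01", periods=24 * 60, freq="H")
demand = pd.Series(
    50 + 10 * np.sin(2 * np.pi * idx.hour / 24) + np.random.normal(0, 1, len(idx)),
    index=idx,
)

h = 24  # forecast horizon (one day ahead)
decomp = STL(demand, period=24, robust=True).fit()

# Trend (+ remainder) forecast with ARIMA, mirroring SEA's ARIMA branch.
trend_part = decomp.trend + decomp.resid
trend_fc = ARIMA(trend_part, order=(1, 1, 1)).fit().forecast(steps=h)

# Seasonal forecast: repeat the last observed daily cycle (stand-in for the ENN).
seasonal_fc = decomp.seasonal[-24:].to_numpy()[:h]

combined_fc = trend_fc.to_numpy() + seasonal_fc
print(combined_fc[:5])
```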


Tutorial: Multistep Forecasting with Seasonal ARIMA in Python

#artificialintelligence

Looking at the ACF and PACF plots of the differenced series, we see our first significant value at lag 4 in the ACF and at the same lag 4 in the PACF, which suggests p = 4 and q = 4. We also have a large value at lag 12 in the ACF plot, which suggests our seasonal period is S = 12, and since this lag is positive it suggests P = 1 and Q = 0. Since this is a differenced series, for SARIMA we set d = 1, and since the seasonal pattern is not stable over time we set D = 0. All together this gives us a SARIMA(4,1,4)(1,0,0)[12] model. Next we run SARIMA with these values to fit a model on our training data. We can see that the multi-step forecast of our SARIMA(4,1,4)(1,0,0)[12] model fits the testing data extremely well, with an RMSE of 23.7! When you manually conduct a good time series analysis, as I have done here, it is difficult to beat ARMA-family models for forecasting. I also ran a grid search and found the best model to be SARIMA(1,0,1)x(1,1,1)[12], which had an AIC of 696.05.
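
The tutorial's own data and code are not reproduced here, but a minimal statsmodels sketch of fitting the SARIMA(4,1,4)(1,0,0)[12] model named above and producing a multi-step forecast might look like this; the synthetic monthly series and the 24-month test split are stand-ins for the tutorial's dataset.

```python
# Hedged sketch (not the tutorial's code): fit SARIMA(4,1,4)(1,0,0)[12] with
# statsmodels and produce a multi-step forecast over a held-out test period.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

idx = pd.date_range("2010-01", periods=120, freq="MS")
y = pd.Series(
    100 + 0.5 * np.arange(120) + 15 * np.sin(2 * np.pi * idx.month / 12)
    + np.random.normal(0, 3, 120),
    index=idx,
)
y_train, y_test = y[:-24], y[-24:]

model = SARIMAX(
    y_train,
    order=(4, 1, 4),              # (p, d, q) from the ACF/PACF reading above
    seasonal_order=(1, 0, 0, 12), # (P, D, Q, S) for the yearly cycle
)
result = model.fit(disp=False)

# Multi-step (out-of-sample) forecast over the 24-month test horizon.
forecast = result.forecast(steps=len(y_test))
rmse = np.sqrt(np.mean((y_test.to_numpy() - forecast.to_numpy()) ** 2))
print(f"RMSE: {rmse:.2f}")
```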


How to Visualize Time Series Data: Tidy Forecasting in R

#artificialintelligence

There are a number of forecasting packages written in R to choose from, each with its own pros and cons. For almost a decade, the forecast package has been a rock-solid framework for time series forecasting. Within the last year or so, however, an official successor named fable has been released, which follows tidy methods as opposed to base R. More recently, modeltime has been released, and it also follows tidy methods.


LSTM-MSNet: Leveraging Forecasts on Sets of Related Time Series with Multiple Seasonal Patterns

arXiv.org Machine Learning

Generating forecasts for time series with multiple seasonal cycles is an important use-case for many industries nowadays. Accounting for the multi-seasonal patterns becomes necessary to generate more accurate and meaningful forecasts in these contexts. In this paper, we propose Long Short-Term Memory Multi-Seasonal Net (LSTM-MSNet), a decomposition-based, unified prediction framework to forecast time series with multiple seasonal patterns. The current state of the art in this space consists typically of univariate methods, in which the model parameters of each time series are estimated independently. Consequently, these models are unable to include key patterns and structures that may be shared by a collection of time series. In contrast, LSTM-MSNet is a globally trained Long Short-Term Memory network (LSTM), where a single prediction model is built across all the available time series to exploit the cross-series knowledge in a group of related time series. Furthermore, our methodology combines a series of state-of-the-art multi-seasonal decomposition techniques to supplement the LSTM learning procedure. In our experiments, we show that on datasets from disparate data sources, such as the popular M4 forecasting competition, a decomposition step is beneficial, whereas in the common real-world situation of homogeneous series from a single application, exogenous seasonal variables or no seasonal preprocessing at all are better choices. All options are readily included in the framework and allow us to achieve competitive results for both cases, outperforming many state-of-the-art multi-seasonal forecasting methods.
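
LSTM-MSNet's full pipeline is more elaborate, but the core global-model idea (deseasonalize each series, pool training windows across all series, fit one shared LSTM) can be sketched roughly as below. The window length, hidden size, training loop, and STL-based deseasonalization are illustrative assumptions, not the paper's exact configuration.

```python
# Rough sketch of the "global model" idea: deseasonalize each series, pool
# training windows across series, and fit one shared LSTM forecaster.
# Window length, hidden size, and the STL step are illustrative assumptions.
import numpy as np
import torch
from torch import nn
from statsmodels.tsa.seasonal import STL

rng = np.random.default_rng(0)
series_group = [
    10 * np.sin(2 * np.pi * np.arange(400) / 24) + rng.normal(0, 1, 400)
    for _ in range(5)
]

WINDOW, HORIZON = 48, 1
X, y = [], []
for s in series_group:
    deseason = s - STL(s, period=24).fit().seasonal   # remove the daily cycle
    for i in range(len(deseason) - WINDOW - HORIZON):
        X.append(deseason[i:i + WINDOW])
        y.append(deseason[i + WINDOW])
X = torch.tensor(np.array(X), dtype=torch.float32).unsqueeze(-1)  # (N, WINDOW, 1)
y = torch.tensor(np.array(y), dtype=torch.float32).unsqueeze(-1)  # (N, 1)

class GlobalLSTM(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(1, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])   # one-step-ahead prediction

model = GlobalLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(5):                         # a few epochs, for illustration
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X), y)
    loss.backward()
    opt.step()
print("training loss:", float(loss))
```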


Stock Price Change Forecasting with Time Series: SARIMAX

#artificialintelligence

There are different techniques for modeling a time series. One of them is the autoregressive process (AR). In an AR model, a time series problem is expressed as a recursive regression problem in which the dependent variable is regressed on past values of the target variable itself. If Yt is our target variable, with values Y1, Y2, … at successive time instances, then a first-order autoregression can be written as Yt = µ + φ(Yt-1 − µ) + εt. Parameter µ is the mean of the process, parameter φ determines the amount of feedback, and εt is the new information (noise) arriving at time t.
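
As a quick illustration of that AR(1) recursion (not the article's own code; the values µ = 5.0 and φ = 0.7 are made up for the example), one can simulate the process and recover the parameters with statsmodels:

```python
# Simulate Yt = mu + phi*(Y(t-1) - mu) + eps_t and recover the parameters.
# mu = 5.0 and phi = 0.7 are arbitrary values chosen for illustration.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(42)
mu, phi, n = 5.0, 0.7, 1000
y = np.empty(n)
y[0] = mu
for t in range(1, n):
    y[t] = mu + phi * (y[t - 1] - mu) + rng.normal(0, 1)

# ARIMA(1,0,0) is an AR(1) model; trend="c" adds the constant (process mean).
fit = ARIMA(y, order=(1, 0, 0), trend="c").fit()
print(fit.params)   # estimated mean (const), phi (ar.L1), and noise variance
```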