A simple and effective predictive resource scaling heuristic for large-scale cloud applications
Flunkert, Valentin, Rebjock, Quentin, Castellon, Joel, Callot, Laurent, Januschowski, Tim
We propose a simple yet effective policy for the predictive auto-scaling of horizontally scalable applications running in cloud environments, where compute resources can only be added with a delay, and where the deployment throughput is limited. Our policy uses a probabilistic forecast of the workload to make scaling decisions that depend on the risk aversion of the application owner. In experiments on real-world and synthetic data, we show that this policy compares favorably to mathematically more sophisticated approaches as well as to simple benchmark policies.
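A minimal sketch of such a risk-aware policy, assuming the probabilistic forecast is given as Monte Carlo samples of the future workload (function and parameter names here are hypothetical, not from the paper):

```python
import math
import statistics

def scale_decision(forecast_samples, capacity_per_instance, risk_aversion):
    """Choose an instance count from a probabilistic workload forecast.

    risk_aversion in (0, 1) is the forecast quantile the owner provisions
    for, e.g. 0.95 means covering 95% of forecast scenarios.
    (Hypothetical sketch; details are not from the paper.)
    """
    # Percentile cut points (99 of them) of the forecast distribution.
    cuts = statistics.quantiles(forecast_samples, n=100)
    target_load = cuts[min(98, max(0, int(risk_aversion * 100) - 1))]
    # Provision enough instances to serve that load (ceiling division).
    return math.ceil(target_load / capacity_per_instance)
```

A more risk-averse owner (higher quantile) provisions for a heavier tail of the workload distribution, trading idle capacity for a lower risk of under-provisioning during the scaling delay.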
Anomaly Detection at Scale: The Case for Deep Distributional Time Series Models
Ayed, Fadhel, Stella, Lorenzo, Januschowski, Tim, Gasthaus, Jan
This paper introduces a new methodology for detecting anomalies in time series data, with a primary application to monitoring the health of (micro-) services and cloud resources. The main novelty in our approach is that instead of modeling time series consisting of real values or vectors of real values, we model time series of probability distributions over real values (or vectors). This extension to time series of probability distributions allows the technique to be applied to the common scenario where the data is generated by requests coming in to a service, which are then aggregated at a fixed temporal frequency. Our method is amenable to streaming anomaly detection and scales to monitoring millions of time series. We show the superior accuracy of our method on synthetic and public real-world data. On the Yahoo Webscope data set, we outperform the state of the art on 3 out of 4 data sets, and we outperform popular open-source anomaly detection tools by up to 17% on average on a real-world data set.
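As a toy illustration of scoring a time bucket by comparing distributions rather than point values, one can measure the gap between the expected and observed per-request distributions, for instance with a two-sample Kolmogorov-Smirnov statistic (a hypothetical stand-in; the paper's actual model and score differ):

```python
import bisect

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the largest gap between
    the two empirical CDFs (0 = identical, 1 = fully separated)."""
    a, b = sorted(sample_a), sorted(sample_b)

    def ecdf(s, x):
        # Fraction of sample s that is <= x.
        return bisect.bisect_right(s, x) / len(s)

    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in a + b)

def is_anomalous(expected_latencies, observed_latencies, threshold=0.5):
    # Flag a time bucket when the observed per-request distribution
    # drifts far from the expected one, even if the mean is unchanged.
    return ks_statistic(expected_latencies, observed_latencies) > threshold
```

Scoring whole distributions per bucket catches shifts (e.g. a bimodal latency split) that aggregating each bucket to a single mean value would hide.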
GluonTS: Probabilistic Time Series Models in Python
Alexandrov, Alexander, Benidis, Konstantinos, Bohlke-Schneider, Michael, Flunkert, Valentin, Gasthaus, Jan, Januschowski, Tim, Maddix, Danielle C., Rangapuram, Syama, Salinas, David, Schulz, Jasper, Stella, Lorenzo, Türkmen, Ali Caner, Wang, Yuyang
We introduce Gluon Time Series (GluonTS, available at https://gluon-ts.mxnet.io), a library for deep-learning-based time series modeling. GluonTS simplifies the development of and experimentation with time series models for common tasks such as forecasting or anomaly detection. It provides all necessary components and tools that scientists need for quickly building new models, for efficiently running and analyzing experiments and for evaluating model accuracy.
Deep Factors for Forecasting
Wang, Yuyang, Smola, Alex, Maddix, Danielle C., Gasthaus, Jan, Foster, Dean, Januschowski, Tim
Producing probabilistic forecasts for large collections of similar and/or dependent time series is a practically relevant and challenging task. Classical time series models fail to capture complex patterns in the data, and multivariate techniques struggle to scale to large problem sizes. At the same time, the reliance of classical models on strong structural assumptions makes them data-efficient and allows them to provide uncertainty estimates. The converse is true for models based on deep neural networks, which can learn complex patterns and dependencies given enough data. In this paper, we propose a hybrid model that incorporates the benefits of both approaches. Our new method is data-driven and scalable via a latent, global, deep component, and it handles uncertainty through a local classical model. We provide both theoretical and empirical evidence for the soundness of our approach through a necessary and sufficient decomposition of exchangeable time series into a global and a local part. Our experiments demonstrate the advantages of our model in terms of data efficiency, accuracy, and computational complexity.
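The global/local split the abstract refers to can be illustrated with a toy decomposition. This is only a hypothetical sketch: the paper learns the global component with a deep network and models the local residual with a classical probabilistic model, whereas here the global part is simply the cross-series average:

```python
def global_local_decompose(series):
    """Split a collection of equal-length series into a shared global
    pattern plus per-series local residuals (toy illustration only)."""
    n, T = len(series), len(series[0])
    # Global part: the pattern shared across all series.
    global_part = [sum(s[t] for s in series) / n for t in range(T)]
    # Local parts: what remains after removing the global pattern;
    # the hybrid model handles these with a classical local model.
    local_parts = [[s[t] - global_part[t] for t in range(T)] for s in series]
    return global_part, local_parts
```

Every series can then be reconstructed exactly as global part plus its local residual, which is the structure the paper's decomposition result formalizes for exchangeable time series.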
Deep State Space Models for Time Series Forecasting
Rangapuram, Syama Sundar, Seeger, Matthias W., Gasthaus, Jan, Stella, Lorenzo, Wang, Yuyang, Januschowski, Tim
We present a novel approach to probabilistic time series forecasting that combines state space models with deep learning. By parametrizing a per-time-series linear state space model with a jointly-learned recurrent neural network, our method retains desired properties of state space models such as data efficiency and interpretability, while making use of the ability to learn complex patterns from raw data offered by deep learning approaches. Our method scales gracefully from regimes where little training data is available to regimes where data from millions of time series can be leveraged to learn accurate models. We provide qualitative as well as quantitative results with the proposed method, showing that it compares favorably to the state-of-the-art.
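The linear state space building block underlying this approach can be sketched as a local-level model with a standard Kalman filter. This is a minimal, hand-parameterized illustration only; in the paper, the per-time-series state space parameters are produced by the jointly learned recurrent neural network:

```python
def kalman_local_level(ys, q=0.1, r=1.0):
    """Kalman filter for a local-level linear state space model:

        level_t = level_{t-1} + w_t,   w_t ~ N(0, q)   (transition)
        y_t     = level_t + v_t,       v_t ~ N(0, r)   (observation)

    Returns the filtered level estimates. The noise variances q and r
    are fixed by hand here; in the paper they would be emitted,
    per series and per step, by the shared RNN.
    """
    level, var = ys[0], 1.0  # crude initialisation from the first point
    levels = []
    for y in ys:
        var += q                      # predict: uncertainty grows
        gain = var / (var + r)        # Kalman gain in [0, 1]
        level += gain * (y - level)   # update: move toward observation
        var *= 1.0 - gain             # posterior variance shrinks
        levels.append(level)
    return levels
```

The filter runs in linear time in the series length, which is what lets such models remain interpretable and data-efficient while the deep component supplies the parameters.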
Approximate Bayesian Inference in Linear State Space Models for Intermittent Demand Forecasting at Scale
Seeger, Matthias, Rangapuram, Syama, Wang, Yuyang, Salinas, David, Gasthaus, Jan, Januschowski, Tim, Flunkert, Valentin
We present a scalable and robust Bayesian inference method for linear state space models. The method is applied to demand forecasting in the context of a large e-commerce platform, paying special attention to intermittent and bursty target statistics. Inference is approximated by the Newton-Raphson algorithm, reduced to linear-time Kalman smoothing, which allows us to operate on several orders of magnitude larger problems than previous related work. In a study on large real-world sales datasets, our method outperforms competing approaches on fast and medium moving items.