Generic Bounds on the Maximum Deviations in Sequential Prediction: An Information-Theoretic Analysis
ABSTRACT

In this paper, we derive generic bounds on the maximum deviations in prediction errors for sequential prediction via an information-theoretic approach. The fundamental bounds are shown to depend only on the conditional entropy of the data point to be predicted given the previous data points. In the asymptotic case, the bounds are achieved if and only if the prediction error is white and uniformly distributed.

Index Terms -- Information-theoretic learning, sequential learning, sequential prediction, bounds on performance, sequence prediction

1. INTRODUCTION

Nowadays, machine learning techniques are becoming increasingly prevalent in real-time systems such as real-time signal processing, feedback control, and robotics. In such systems, on the one hand, decisions on actions are to be made in a sequential manner (sequential decision making); on the other hand, the dynamics of the systems and of the environment, which are governed by physical laws, play an indispensable role and must be taken into consideration (interaction with the real world).
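The maximum-entropy argument behind bounds of this kind can be sketched as follows. This is our paraphrase, not the paper's exact derivation; the notation $x_k$, $\hat{x}_k$, $e_k$ and the use of natural logarithms (nats) are assumptions.

```latex
% Let \hat{x}_k = f(x_0, \ldots, x_{k-1}) be any causal predictor and
% e_k = x_k - \hat{x}_k the resulting prediction error.
\begin{align}
  % Since \hat{x}_k is a deterministic function of the past, subtracting it
  % leaves the conditional differential entropy unchanged:
  h\left(e_k \mid x_0, \ldots, x_{k-1}\right)
    &= h\left(x_k \mid x_0, \ldots, x_{k-1}\right). \\
  % Conditioning never increases differential entropy, and among all
  % densities supported on [-M, M] the uniform density maximizes it,
  % with entropy \ln(2M):
  h\left(e_k \mid x_0, \ldots, x_{k-1}\right)
    &\le h(e_k)
     \le \ln\left(2\, \operatorname{ess\,sup} |e_k|\right). \\
  % Combining the two relations yields a generic lower bound on the
  % maximum deviation in terms of the conditional entropy alone:
  \operatorname{ess\,sup} |e_k|
    &\ge \frac{1}{2}\, e^{\,h\left(x_k \mid x_0, \ldots, x_{k-1}\right)}.
\end{align}
```

Equality requires both inequalities to be tight, i.e., $e_k$ must be independent of the past (white) and uniformly distributed, which is consistent with the achievability condition stated in the abstract.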
Oct-23-2019