4 Discussions and open problems

Neural Information Processing Systems 

We discuss the assumptions and implications of our results, as well as related open problems.

Other loss functions. As mentioned in Section 1.1, standard arguments based on concentration … Lemma 5 crucially relies on stationarity. … Theorems 6.3 and 6.5], which, in turn, follow the arguments of [ … ].

In this section we prove the optimal lower bound in Theorem 7 for three states. Finally, we relate (39) formally to the minimax prediction risk under the KL divergence.

In this section, we rigorously carry out the lower bound proof sketched in Section 3.2: in Section 6.2.2, we make the steps in Section 3.2 rigorous; in Section 6.2.3, we choose a prior distribution on the transition probabilities and prove a lower bound on the resulting mutual information, thereby completing the proof of Theorem 1, with the added bonus that the construction is restricted to irreducible chains. The chain M is shown in Figure 2. One can also verify that the spectral gap of M is Θ( … ).

We make use of the properties (P1)-(P3) in Section 6.2.1 to prove Lemma 9.

Proof of Lemma 9.
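The step that turns the minimax prediction risk under KL divergence into a mutual information quantity can be sketched as follows. This is a standard Bayes-risk reduction, written here in illustrative notation (the estimator $\widehat{M}$, the prior $\pi$ on transition matrices, and the trajectory $X^n = (X_1,\dots,X_n)$ are our labels, not necessarily the paper's exact definitions):

```latex
% Minimax risk is lower-bounded by the Bayes risk under any prior \pi,
% and the Bayes risk under KL loss equals a conditional mutual information.
\[
\inf_{\widehat{M}} \sup_{M}
  \mathbb{E}\!\left[ D\big( M(\cdot \mid X_n) \,\big\|\, \widehat{M}(\cdot \mid X^n) \big) \right]
\;\ge\;
\inf_{\widehat{M}} \; \mathbb{E}_{M \sim \pi}\,
  \mathbb{E}\!\left[ D\big( M(\cdot \mid X_n) \,\big\|\, \widehat{M}(\cdot \mid X^n) \big) \right]
\;=\;
I\big( M ;\, X_{n+1} \mid X^n \big),
\]
```

where the final equality holds because the Bayes-optimal predictor under KL loss is the posterior predictive distribution $\widehat{M}(\cdot \mid X^n) = \mathbb{E}[\,M(\cdot \mid X_n) \mid X^n\,]$. Choosing a prior supported on irreducible chains and lower-bounding the resulting mutual information then yields a minimax lower bound, which is the strategy described above.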
