A Proofs of Main Results

Neural Information Processing Systems 

We follow the setting defined in Section 1. Throughout, we work in the so-called proportional regime. We require, however, a slightly stronger condition in Eq. (…); this condition might appear strong. We extend the proof of the one-dimensional CLT to mixture models in Section 2.4, and we provide further formal arguments in App. C. In particular, we argue that a large class of distributions, including …, satisfies it.

Handling the dependence of the estimator on the data is the main mathematical difficulty in proving Thm. 2. This is achieved in the Appendix for some constant M > 0. Building on those assumptions, [17] prove the following: Suppose that Assumptions A1–A3 hold …

A.3 Sketch of proof of Lemma 8, adapted from [17]

Interpolation path. For any 0 ≤ t ≤ π/2, define U_t …

The main differences between our Assumptions 1–4 and Assumptions A1–A3 are the following: (i) Assumption 1 is unchanged.
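The definition of the interpolation path U_t is truncated in this copy. A common choice in Gaussian interpolation arguments of this kind, given here only as an illustrative assumption and not necessarily the definition used by [17], is the trigonometric path between two independent Gaussian vectors U and V:

```latex
% Illustrative (assumed) trigonometric interpolation path:
% U and V are independent with the same Gaussian law.
U_t \;=\; \sin(t)\, U \;+\; \cos(t)\, V,
\qquad 0 \le t \le \tfrac{\pi}{2}.
```

Under this choice, U_0 = V and U_{π/2} = U, each U_t has the common Gaussian law (since sin²(t) + cos²(t) = 1), and the derivative dU_t/dt = cos(t) U − sin(t) V is uncorrelated with U_t, hence independent of it in the jointly Gaussian case. These properties are what typically make such a path useful for interpolating between the quantities compared in a lemma of this type.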
