4d215ab7508a3e089af43fb605dd27d1-Supplemental.pdf

Neural Information Processing Systems

Providing a very low critical probability p_c means that certification occurs when the simulation ends after a large number of iterations m. On the other hand, the projection of X onto any other direction orthogonal to g remains normally distributed. For each pair of parameters (N, T) we make 1000 runs and count the number of false positives (i.e., the number of times the algorithm wrongly asserted that p < p_c). Combining the latter proposal with T, we again obtain a reversible proposal. Step 4: Conclusion by induction. Let l_0 be any critical level such that π_0(h(X) > l_0) > 0. We consider the following induction hypothesis at iteration k: H_k: on the event {L_k ≥ l_0}, the probability that the two particle systems are equal tends exponentially fast to 1 as t → +∞.
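These excerpts outline a Monte Carlo validation of the certification step: for each parameter pair (N, T), the algorithm is run 1000 times and the false positives (runs where it wrongly asserts p < p_c) are counted. A minimal sketch of that bookkeeping, assuming a hypothetical certify(N, T) stand-in for the actual splitting-based certifier and a ground truth with p ≥ p_c, so every positive answer is false:

```python
import random

def certify(N, T, rng):
    """Hypothetical stand-in for the splitting-based certification
    algorithm: returns True when it asserts p < p_c. Here a small
    fixed error rate fakes the behavior, purely for illustration."""
    return rng.random() < 0.01  # placeholder, not the paper's algorithm

def false_positive_rate(N, T, runs=1000, seed=0):
    """Run the certifier `runs` times under a ground truth where
    p >= p_c, so every True answer counts as a false positive."""
    rng = random.Random(seed)
    false_positives = sum(certify(N, T, rng) for _ in range(runs))
    return false_positives / runs

for N, T in [(100, 10), (100, 50), (500, 10)]:
    print(f"N={N}, T={T}: FP rate ~ {false_positive_rate(N, T):.3f}")
```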



Change Detection of Markov Kernels with Unknown Post Change Kernel using Maximum Mean Discrepancy

Chen, Hao, Tang, Jiacheng, Gupta, Abhishek

arXiv.org Machine Learning

In this paper, we develop a new change detection algorithm for detecting a change in the Markov kernel of a process on a metric space when the post-change kernel is unknown. Under the assumption that the pre- and post-change Markov kernels are geometrically ergodic, we derive an upper bound on the mean delay and a lower bound on the mean time between false alarms.
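The abstract does not reproduce the paper's exact test statistic; as an illustration only, here is a generic sketch of MMD-based change detection for a Markov chain: compare a pre-change reference window against a sliding window using a Gaussian kernel, and raise an alarm when the (biased) squared-MMD estimate exceeds a threshold. The window sizes, bandwidth sigma, and threshold are illustrative choices, not values from the paper:

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    """Gaussian RBF kernel matrix between sample sets X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def mmd2(X, Y, sigma=1.0):
    """Biased squared-MMD estimate between samples X and Y."""
    return (rbf_kernel(X, X, sigma).mean()
            + rbf_kernel(Y, Y, sigma).mean()
            - 2 * rbf_kernel(X, Y, sigma).mean())

def detect_change(chain, ref_len=200, win=50, threshold=0.05, sigma=1.0):
    """Scan a trajectory; alarm when the MMD between the pre-change
    reference window and the current sliding window exceeds `threshold`."""
    ref = chain[:ref_len]
    for t in range(ref_len + win, len(chain)):
        window = chain[t - win:t]
        if mmd2(ref, window, sigma) > threshold:
            return t  # declared change point
    return None

# Toy example: AR(1) chain whose noise scale changes at t = 500.
rng = np.random.default_rng(0)
x, chain = 0.0, []
for t in range(1000):
    scale = 0.1 if t < 500 else 0.5  # the kernel change
    x = 0.8 * x + rng.normal(scale=scale)
    chain.append([x])
print("alarm at", detect_change(np.array(chain)))
```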


Noisy Neural Networks and Generalizations

Siegelmann, Hava T., Roitershtein, Alexander, Ben-Hur, Asa

Neural Information Processing Systems

In this paper we define a probabilistic computational model which generalizes many noisy neural network models, including the recent work of Maass and Sontag [5]. We identify weak ergodicity as the mechanism that restricts the computational power of probabilistic models to definite languages, independent of the characteristics of the noise (whether it is discrete or analog, and whether or not it depends on the input) and independent of whether the variables are discrete or continuous. We give examples of weakly ergodic models, including noisy computational systems with noise depending on the current state and inputs, aggregate models, and computational systems which update in continuous time. 1 Introduction Noisy neural networks were recently examined, e.g.
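Weak ergodicity here means that products of the (noisy, possibly input-dependent) transition matrices forget the initial state, which is what collapses the recognizable languages to the definite ones. A small illustrative sketch, not the paper's model: each step applies a stochastic matrix whose entries are bounded away from zero, and the total-variation distance between two differently initialized state distributions contracts, as the Dobrushin coefficient quantifies:

```python
import numpy as np

def dobrushin(P):
    """Dobrushin ergodicity coefficient: the maximum total-variation
    distance between any two rows of the stochastic matrix P.
    Values < 1 mean P contracts differences between distributions."""
    n = P.shape[0]
    return max(0.5 * np.abs(P[i] - P[j]).sum()
               for i in range(n) for j in range(n))

rng = np.random.default_rng(1)

def noisy_step(n=4):
    """A random state/input-dependent transition matrix with full
    support, i.e. every entry bounded away from zero (the noise)."""
    P = rng.random((n, n)) + 0.1
    return P / P.sum(axis=1, keepdims=True)

# Two different initial distributions driven by the SAME input sequence:
# the TV distance between them shrinks geometrically.
mu, nu = np.eye(4)[0], np.eye(4)[3]
for t in range(20):
    P = noisy_step()
    mu, nu = mu @ P, nu @ P
    print(t, "TV:", round(0.5 * np.abs(mu - nu).sum(), 6),
          "coef:", round(dobrushin(P), 3))
```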
