Uncertainty in Neural Processes

arXiv.org Machine Learning

We explore the effects of architecture and training objective choices on amortized posterior predictive inference in probabilistic conditional generative models. We intend this work as a counterpoint to a recent trend in the literature that stresses achieving good samples when the amount of conditioning data is large; we instead focus on the case where the amount of conditioning data is small. We highlight specific architecture and objective choices that we find lead to qualitative and quantitative improvements in posterior inference in this low-data regime. Specifically, we explore the effects of the choice of pooling operator and variational family on posterior quality in neural processes. Superior posterior predictive samples drawn from our novel neural process architectures are demonstrated via image completion/in-painting experiments.
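To make the "choice of pooling operator" concrete, here is a minimal sketch of a permutation-invariant set encoder of the kind used in neural-process-style models, with mean and max pooling as interchangeable options. The class name, shapes, and layer sizes are my own illustrative assumptions, not the architecture from the abstract above.

```python
# Hypothetical sketch of permutation-invariant pooling over a context set,
# as found in neural-process-style encoders. Names and shapes are assumptions.
import torch
import torch.nn as nn


class SetEncoder(nn.Module):
    """Embeds (x, y) context pairs and pools them into one representation."""

    def __init__(self, x_dim, y_dim, hidden_dim, pool="mean"):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + y_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        self.pool = pool

    def forward(self, x_ctx, y_ctx):
        # x_ctx: (batch, n_context, x_dim); y_ctx: (batch, n_context, y_dim)
        h = self.net(torch.cat([x_ctx, y_ctx], dim=-1))
        # The pooling operator determines how the representation behaves as
        # the context set shrinks -- the low-data regime discussed above.
        if self.pool == "mean":
            return h.mean(dim=1)
        elif self.pool == "max":
            return h.max(dim=1).values
        raise ValueError(f"unknown pooling operator: {self.pool}")


# Example: pooling a context set of 3 points into a single 64-d representation.
enc = SetEncoder(x_dim=1, y_dim=1, hidden_dim=64, pool="max")
r = enc(torch.randn(8, 3, 1), torch.randn(8, 3, 1))
print(r.shape)  # torch.Size([8, 64])
```

Both pooling choices are invariant to the order of context points; they differ in how sensitive the pooled representation is to individual points when the context set is small.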


Printing Error Blamed for Wrong Sample Ballots

U.S. News

Republican Director of Elections Rick Stream says St. Louis County sends out election cards with the sample ballot printed on the back, a way of giving voters a preview of what they'll be voting on. The mailings went out Monday.


Nearly tight sample complexity bounds for learning mixtures of Gaussians via sample compression schemes

Neural Information Processing Systems

We prove that Θ̃(kd²/ε²) samples are necessary and sufficient for learning a mixture of k Gaussians in ℝ^d, up to error ε in total variation distance. This improves both the known upper bounds and lower bounds for this problem. For mixtures of axis-aligned Gaussians, we show that Õ(kd/ε²) samples suffice, matching a known lower bound. The upper bound rests on a novel technique for distribution learning built around a notion of sample compression. Any class of distributions that admits such a sample compression scheme can also be learned with few samples.
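As a back-of-the-envelope illustration of what these rates mean in practice, the sketch below evaluates kd²/ε² and kd/ε² for concrete parameter settings, ignoring constants and the polylogarithmic factors hidden by the tilde. The function is mine, not the paper's.

```python
# Back-of-the-envelope sample complexity from the bounds above, ignoring
# constants and the polylogarithmic factors hidden by the tilde notation.

def mixture_sample_complexity(k, d, eps, axis_aligned=False):
    """Rate for learning a mixture of k Gaussians in R^d to TV error eps.

    General covariances scale as k * d^2 / eps^2; axis-aligned (diagonal)
    covariances scale as k * d / eps^2.
    """
    dim_factor = d if axis_aligned else d ** 2
    return k * dim_factor / eps ** 2


# Example: 10 components in 50 dimensions, target TV error 0.1.
print(f"general:      ~{mixture_sample_complexity(10, 50, 0.1):.0f} samples")
print(f"axis-aligned: ~{mixture_sample_complexity(10, 50, 0.1, axis_aligned=True):.0f} samples")
```

The d² versus d gap makes the practical payoff of the axis-aligned structure visible: in the example above it cuts the rate by a factor of 50.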


Android Malware Rising: 350 New Infected Apps Appear Every Hour, Says New Report

International Business Times

Android malware is on the rise, say researchers at G Data Security. A new report by the security firm revealed that in the first quarter of 2017, over 750,000 new malware apps were discovered. Android holds a 72 percent share of the mobile market, so it is reasonable that more attacks would happen on this platform. The number of malware samples cropping up each day is nonetheless staggering, and there's no sign the problem will be corrected anytime soon. Since 2012, new Android malware samples have increased each year, with the greatest hikes occurring over the last year.


Coarse classing and fine classing/Observation and Performance Window

@machinelearnbot

I would also appreciate it if you could let me know how one goes about deciding on an optimum sample size before embarking on the analysis. For example, how will I know what sample size is good enough to come up with a good or "champion" model? Is this an iterative process where we take different sample sizes and compare the models? Could you advise me on this?
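One concrete way to make that iteration systematic (my suggestion, not something from the thread) is to plot a learning curve: fit the model on increasing subsamples and look for where validation performance plateaus. A minimal sketch with scikit-learn, using a placeholder dataset and model:

```python
# Minimal sketch: estimate an adequate sample size from a learning curve.
# The dataset, model, and metric are placeholders for illustration.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import learning_curve

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)

sizes, train_scores, val_scores = learning_curve(
    LogisticRegression(max_iter=1000),
    X, y,
    train_sizes=np.linspace(0.1, 1.0, 8),  # try 10% .. 100% of the data
    cv=5,
    scoring="roc_auc",
)

# If validation AUC stops improving as the sample grows, a larger
# development sample is unlikely to change the model much.
for n, score in zip(sizes, val_scores.mean(axis=1)):
    print(f"n={n:5d}  mean CV AUC={score:.3f}")
```

The point at which the validation curve flattens is a reasonable working answer to "how much data is enough for a champion model" on that particular problem; repeating the comparison across candidate models is exactly the iterative process the question describes.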