Dillon, Joshua
Likelihood Ratios for Out-of-Distribution Detection
Ren, Jie; Liu, Peter J.; Fertig, Emily; Snoek, Jasper; Poplin, Ryan; DePristo, Mark; Dillon, Joshua; Lakshminarayanan, Balaji
Discriminative neural networks offer little or no performance guarantees when deployed on data not generated by the same process as the training distribution. On such out-of-distribution (OOD) inputs, the prediction may not only be erroneous, but confidently so, limiting the safe deployment of classifiers in real-world applications. One such challenging application is bacteria identification based on genomic sequences, which holds the promise of early detection of diseases, but requires a model that can output low-confidence predictions on OOD genomic sequences from new bacteria that were not present in the training data. We introduce a genomics dataset for OOD detection that allows other researchers to benchmark progress on this important problem. We investigate approaches for OOD detection based on deep generative models and observe that the likelihood score is heavily affected by population-level background statistics. We propose a likelihood ratio method for deep generative models that corrects for these confounding background statistics.
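As a rough illustration of the scoring rule named in the title (a minimal sketch, not the authors' released code), the OOD score contrasts per-example log-likelihoods from a model of the full data with those from a background model; the model objects and their log_prob interface below are assumptions:

    import numpy as np

    def likelihood_ratio_score(log_p_model, log_p_background):
        # Per-example OOD score: log p(x | full model) - log p(x | background model).
        # Low scores flag inputs that are likely out-of-distribution.
        return np.asarray(log_p_model) - np.asarray(log_p_background)

    # Hypothetical usage: `model` is a generative model trained on
    # in-distribution genomic sequences; `bg_model` is trained on randomly
    # perturbed sequences so it captures only background statistics.
    # scores = likelihood_ratio_score(model.log_prob(x), bg_model.log_prob(x))
    # is_ood = scores < threshold  # threshold tuned on a validation split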
Statistical Translation, Heat Kernels and Expected Distances
Dillon, Joshua; Mao, Yi; Lebanon, Guy; Zhang, Jian
High-dimensional structured data such as text and images is often poorly understood and misrepresented in statistical modeling. The standard histogram representation suffers from high variance and performs poorly in general. We explore novel connections between statistical translation, heat kernels on manifolds and graphs, and expected distances. These connections provide a new framework for unsupervised metric learning for text documents. Experiments indicate that the resulting distances are generally superior to their more standard counterparts.
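To make the expected-distance idea concrete, here is a toy sketch (an assumption-laden illustration, not the paper's exact construction): word histograms are smoothed by a row-stochastic translation matrix, a discrete analogue of heat-kernel diffusion on a word graph, before being compared:

    import numpy as np

    def diffused_distance(x, y, T, steps=1):
        # x, y: term-frequency histograms over a vocabulary of size V.
        # T: row-stochastic V x V matrix of word-to-word translation
        # probabilities (a discrete stand-in for a heat kernel).
        x = np.asarray(x, float); x = x / x.sum()
        y = np.asarray(y, float); y = y / y.sum()
        for _ in range(steps):  # more steps ~ longer diffusion time
            x, y = x @ T, y @ T
        # Hellinger distance between the smoothed histograms.
        return np.sqrt(0.5 * np.sum((np.sqrt(x) - np.sqrt(y)) ** 2))

Compared with a raw-histogram distance, the smoothing step lets related but non-identical words contribute to the similarity, which is the intuition behind replacing a fixed distance with its expectation under a translation model.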