

Appendix

Neural Information Processing Systems

We provide more information on AIPS' deductive engine and the training process for the value network. To highlight reasoning ability and keep the proofs readable, we avoid brute-force methods such as augmentation-substitution and Wu's method [Wu, 1978].






Appendix A: Proofs. A.1: Proof of Lemma 3.1

Neural Information Processing Systems

A.4 Proof of Lemma 4.5. The proof of Lemma 4.5 depends on another lemma, which will also be useful in the unknown hypothesis class setting. Then there exists some c for which h_c(x) ∉ {0, 1}, which is impossible. If q_H(x) = c, we don't need to prove anything. Let h ∈ H denote the decision maker's decision rule. From Corollary 4.6, we know that h(x) = 1{q_H(x) ≥ c} as long as q_H(x) ≠ c.




Pseudo-Likelihood Inference

Neural Information Processing Systems

Simulation-Based Inference (SBI) is a common name for an emerging family of approaches that infer model parameters when the likelihood is intractable. Existing SBI methods either approximate the likelihood, such as Approximate Bayesian Computation (ABC), or directly model the posterior, such as Sequential Neural Posterior Estimation (SNPE). While ABC is efficient on low-dimensional problems, on higher-dimensional tasks it is generally outperformed by SNPE, which leverages function approximation. In this paper, we propose Pseudo-Likelihood Inference (PLI), a new method that brings neural approximation into ABC, making it competitive on challenging Bayesian system identification tasks. By utilizing integral probability metrics, we introduce a smooth likelihood kernel with an adaptive bandwidth that is updated based on information-theoretic trust regions. Thanks to this formulation, our method (i) allows for optimizing neural posteriors via gradient descent, (ii) does not rely on summary statistics, and (iii) enables multiple observations as input. In comparison to SNPE, it leads to improved performance when more data is available. The effectiveness of PLI is evaluated on four classical SBI benchmark tasks and on a highly dynamic physical system, showing particular advantages on stochastic simulations and multi-modal posterior landscapes.
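The core idea of a smooth likelihood kernel built from an integral probability metric can be illustrated with a minimal numpy sketch. Everything here is an assumption for illustration: the function names are hypothetical, a biased RBF-kernel MMD estimate stands in for the integral probability metric, and the bandwidth is held fixed rather than adapted via the paper's information-theoretic trust regions.

```python
import numpy as np

def rbf_kernel(a, b, ls=1.0):
    # pairwise Gaussian kernel between two sample sets of shape (n, d)
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * ls ** 2))

def mmd2_biased(x, y, ls=1.0):
    # biased MMD^2 estimate; always >= 0, used here as an
    # illustrative integral probability metric between sample sets
    return (rbf_kernel(x, x, ls).mean()
            + rbf_kernel(y, y, ls).mean()
            - 2.0 * rbf_kernel(x, y, ls).mean())

def pseudo_likelihood(x_obs, x_sim, bandwidth=0.1):
    # smooth likelihood kernel: the distance between observed and
    # simulated data is mapped through an exponential with a bandwidth
    # (fixed here; adaptive in the method described above)
    return float(np.exp(-mmd2_biased(x_obs, x_sim) / bandwidth))
```

Because the kernel compares full sample sets rather than hand-crafted summary statistics, simulations whose output distribution matches the observations receive a pseudo-likelihood near one, while mismatched simulations are smoothly down-weighted.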


ABC: Auxiliary Balanced Classifier for Class-imbalanced Semi-supervised Learning

Neural Information Processing Systems

Existing semi-supervised learning (SSL) algorithms typically assume class-balanced datasets, although the class distributions of many real-world datasets are imbalanced. In general, classifiers trained on a class-imbalanced dataset are biased toward the majority classes. This issue becomes more problematic for SSL algorithms because they use the biased predictions on unlabeled data for training. However, traditional class-imbalanced learning techniques, which are designed for labeled data, cannot be readily combined with SSL algorithms. We propose a scalable class-imbalanced SSL algorithm that can effectively use unlabeled data while mitigating class imbalance, by introducing a single-layer auxiliary balanced classifier (ABC) attached to a representation layer of an existing SSL algorithm. The ABC is trained with a class-balanced loss on each minibatch, while using the high-quality representations that the backbone SSL algorithm learns from all data points in the minibatch, to avoid overfitting and information loss. Moreover, we use consistency regularization, a recent SSL technique for utilizing unlabeled data, in a modified way: the ABC is trained to be balanced among the classes by selecting unlabeled data with the same probability for each class. The proposed algorithm achieves state-of-the-art performance in various class-imbalanced SSL experiments on four benchmark datasets.
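The class-balanced loss that trains the auxiliary classifier can be sketched in a few lines of numpy. This is a minimal illustration under stated assumptions: the function names are hypothetical, inverse-frequency weighting stands in for the paper's class-balanced loss, and the backbone SSL algorithm and the consistency-regularization term are omitted, with the representations simply given as an input array.

```python
import numpy as np

def class_balanced_weights(labels, num_classes):
    # inverse-frequency weights so that every class contributes
    # equally to the loss regardless of how imbalanced the batch is
    counts = np.bincount(labels, minlength=num_classes).astype(float)
    return counts.sum() / (num_classes * np.maximum(counts, 1.0))

def abc_loss(reps, labels, W, b):
    # single linear layer on top of backbone representations,
    # followed by a weighted cross-entropy
    logits = reps @ W + b
    logits = logits - logits.max(axis=1, keepdims=True)  # stable softmax
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    w = class_balanced_weights(labels, W.shape[1])
    nll = -np.log(probs[np.arange(len(labels)), labels] + 1e-12)
    return float((w[labels] * nll).mean())
```

With nine samples of class 0 and one of class 1, the minority class receives a nine-times-larger weight, so the gradient through the single layer is not dominated by the majority class even though the backbone still sees every data point.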