PSPU: Enhanced Positive and Unlabeled Learning by Leveraging Pseudo Supervision

Wang, Chengjie, Xu, Chengming, Gan, Zhenye, Hu, Jianlong, Zhu, Wenbing, Ma, Lizhuang

arXiv.org Artificial Intelligence 

Abstract--Positive and Unlabeled (PU) learning, a binary classification model trained with only positive and unlabeled data, generally suffers from overfitted risk estimation due to inconsistent data distributions. To address this, we introduce a pseudo-supervised PU learning framework (PSPU), in which we train the PU model first, use it to gather confident samples for pseudo supervision, and then apply this supervision to correct the PU model's weights by leveraging non-PU objectives.

[Figure 1: Challenges in PU net: a traditional PU net suffers ... Images of deer denote the positive samples, and those of goat denote the negative samples.]

I. INTRODUCTION

Positive and Unlabeled (PU) learning is a binary classification task ... Such a task is widely applicable in different real-life domains, e.g., fraud recognition in financial fields [1], fake detection in recommendation systems [2], pathologic diagnosis in medical image processing, anomaly detection in industry, satellite image recognition, etc. ... strong selected completely at random (SCAR) assumption, which implies that the distributions of both labeled and unlabeled positive data are similar and facilitates the use of the risk ...
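As a concrete illustration of the PSPU pipeline described in the abstract (PU pre-training, gathering confident samples for pseudo supervision, then correcting the model with non-PU objectives), the sketch below shows one possible training step. It assumes an nnPU-style risk estimator for the PU stage, a fixed confidence threshold tau for selecting pseudo-labeled samples, and binary cross-entropy as the non-PU objective; these specifics, and the names nn_pu_risk, pspu_step, and tau, are illustrative assumptions rather than details taken from the paper.

```python
# Minimal PSPU-style sketch (assumptions: nnPU risk for the PU stage, a fixed
# confidence threshold for pseudo-labels, BCE as the non-PU correction objective).
import torch
import torch.nn as nn

def nn_pu_risk(scores_p, scores_u, prior):
    """Non-negative PU risk: prior-weighted positive risk plus clamped negative risk."""
    loss = nn.functional.softplus              # logistic surrogate for the 0-1 loss
    r_p_pos = loss(-scores_p).mean()           # positives treated as positive
    r_p_neg = loss(scores_p).mean()            # positives treated as negative
    r_u_neg = loss(scores_u).mean()            # unlabeled treated as negative
    neg_risk = r_u_neg - prior * r_p_neg
    return prior * r_p_pos + torch.clamp(neg_risk, min=0.0)

def pspu_step(model, x_p, x_u, prior, optimizer, tau=0.95):
    # Stage 1: ordinary PU update on positive and unlabeled batches.
    optimizer.zero_grad()
    nn_pu_risk(model(x_p).view(-1), model(x_u).view(-1), prior).backward()
    optimizer.step()

    # Stage 2: gather confident unlabeled samples as pseudo-labeled data.
    with torch.no_grad():
        probs = torch.sigmoid(model(x_u).view(-1))
    keep = (probs > tau) | (probs < 1 - tau)

    # Stage 3: correct the PU model's weights with a non-PU (supervised) objective.
    if keep.any():
        pseudo_y = (probs[keep] > 0.5).float()
        optimizer.zero_grad()
        logits = model(x_u[keep]).view(-1)
        nn.functional.binary_cross_entropy_with_logits(logits, pseudo_y).backward()
        optimizer.step()
```

The two-stage correction mirrors the idea in the abstract: the PU-trained classifier supplies confident pseudo-labels, and a standard supervised loss on those samples is then used to refine the same model.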
