Adaptive Negative Evidential Deep Learning for Open-set Semi-supervised Learning

Yang Yu, Danruo Deng, Furui Liu, Yueming Jin, Qi Dou, Guangyong Chen, Pheng-Ann Heng

arXiv.org Artificial Intelligence 

Semi-supervised learning (SSL) has made significant progress by propagating label information from labeled data to unlabeled data (Berthelot et al. 2019; Xu et al. 2021; Wang et al. 2022b; Zheng et al. 2022). Despite this success, SSL methods are deeply rooted in the closed-set assumption that labeled data, unlabeled data, and test data share the same predefined label set. In reality (Yu et al. 2020), such an assumption may not always hold, as we can only accurately control the label set of the labeled data, while unlabeled and test data may include outliers that belong to novel classes not seen in the labeled data.

Moreover, when we tackle a K-way classification problem with a large K, binary detectors are less robust at identifying outliers from such a complex dataset containing multi-class information (Carbonneau et al. 2018). One advanced method, evidential deep learning (EDL) (Sensoy, Kaplan, and Kandemir 2018), can explicitly quantify the classification uncertainty corresponding to the unknown class by treating the network's output as evidence for parameterizing a Dirichlet distribution according to subjective logic (Jøsang 2016). Compared with Softmax ...
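The subjective-logic mapping behind EDL is compact enough to sketch: the network's non-negative outputs are read as per-class evidence e_k, the Dirichlet concentration parameters are alpha_k = e_k + 1, and the uncertainty mass u = K / sum_k alpha_k grows when total evidence is low, which is what flags unknown-class inputs. The PyTorch snippet below is a minimal illustration of these standard EDL formulas (Sensoy, Kaplan, and Kandemir 2018); the function name and the softplus evidence activation are illustrative assumptions, not details taken from this paper.

```python
import torch
import torch.nn.functional as F

def edl_uncertainty(logits: torch.Tensor):
    """Dirichlet-based uncertainty in the standard EDL / subjective-logic style.

    logits: (batch, K) raw network outputs for a K-way classification problem.
    Returns expected class probabilities and a per-sample uncertainty mass.
    """
    # Non-negative evidence for each class (softplus is one common choice).
    evidence = F.softplus(logits)
    # Dirichlet concentration parameters: alpha_k = e_k + 1.
    alpha = evidence + 1.0
    strength = alpha.sum(dim=-1, keepdim=True)      # S = sum_k alpha_k
    probs = alpha / strength                        # expected probabilities E[p_k]
    k = logits.shape[-1]
    uncertainty = k / strength.squeeze(-1)          # u = K / S, mass left to "unknown"
    return probs, uncertainty

# A sample with little total evidence yields u close to 1,
# marking it as a likely outlier from a novel class.
probs, u = edl_uncertainty(torch.randn(4, 10))
```

Unlike a Softmax output, which always produces a normalized point estimate, this formulation keeps an explicit uncertainty mass that can be thresholded to detect outliers.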
