

A Proof for Claim

Neural Information Processing Systems

CIFAR-10-LT, CIFAR-100-LT, ImageNet-100-LT, and Places-LT are 5, 80, 50, and 182, respectively. Our default training set for each dataset is summarized in Table 8.






Neural Information Processing Systems

The way to instantiate BACON will be similar to MFN. The following Lemma will show that Definition 1.2 can be extended to analyzing functions from a different domain. Let F = g_L ∘ ⋯ ∘ g_1 ∘ γ, with each g_i being a multivariate polynomial. The inductive hypothesis is: for k ≥ 1, if z_k[j] is a linear sum of B for all j, then z_{k+1}[l] is a linear sum of B for all l. By the definition of z, we know that z_{k+1} = g_k(z_k), where g_k is a multivariate polynomial of finite degree d.
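The inductive step can be sketched explicitly. The following is only a sketch under an assumption not spelled out in the excerpt: that B is a family of sinusoids closed under products via product-to-sum identities, as in MFN-style constructions.

```latex
% Inductive step (sketch). Assume z_k[j] = \sum_i a_{ij} b_i with b_i \in B.
% Since g_k is a multivariate polynomial of degree d,
\begin{align*}
  z_{k+1}[l] &= g_k(z_k)[l]
    = \sum_{|\alpha| \le d} c_{\alpha,l} \prod_j z_k[j]^{\alpha_j},\\
  \prod_j z_k[j]^{\alpha_j}
    &= \prod_j \Big(\sum_i a_{ij}\, b_i\Big)^{\alpha_j},
\end{align*}
% so each output coordinate expands into a sum of finite products of elements
% of B. If B consists of sinusoids, product-to-sum identities, e.g.
\[
  \sin u \,\sin v = \tfrac12\big(\cos(u - v) - \cos(u + v)\big),
\]
% rewrite every such product as a linear sum over B, hence z_{k+1}[l] is
% again a linear sum of B, closing the induction.
```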


Towards Distribution-Agnostic Generalized Category Discovery

Neural Information Processing Systems

Data imbalance and open-ended distribution are two intrinsic characteristics of the real visual world. Though encouraging progress has been made in tackling each challenge separately, few works have been dedicated to combining them toward real-world scenarios. While several previous works have focused on classifying close-set samples and detecting open-set samples during testing, it is still essential to be able to classify unknown subjects, as human beings can. In this paper, we formally define a more realistic task as distribution-agnostic generalized category discovery (DA-GCD): generating fine-grained predictions for both close- and open-set classes in a long-tailed open-world setting.

