Therefore, estimating how well a given model might perform on the new data is an important step toward reliable ML applications. This is very challenging, however, as the data distribution can change in flexible ways, and we may not have any labels on the new data, which is often the case in monitoring settings. In this paper, we propose a new distribution shift model, Sparse Joint Shift (SJS), which considers the joint shift of both labels and a few features.
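To make the SJS setting concrete, the following is a minimal sketch (not the paper's code; all variable names and the toy data are illustrative) that simulates a sparse joint shift by importance-resampling a synthetic dataset with weights that depend only on the label and one feature, leaving the conditional distribution of the remaining features unchanged:

```python
import numpy as np

# Hypothetical sketch: simulate a Sparse Joint Shift (SJS) by reweighting
# samples according to the joint value of the label y and a single
# "shifted" feature x[:, 0]. Only P(y, x0) changes; the distribution of
# the other features conditioned on (y, x0) is preserved by construction.
rng = np.random.default_rng(0)

n = 10_000
x = rng.normal(size=(n, 5))                     # 5 synthetic features
y = (x[:, 0] + 0.5 * x[:, 1] > 0).astype(int)   # synthetic labels

# Resampling weights depend only on (y, sign of feature 0): this sparsity
# in the shift mechanism is what makes the shift "sparse joint".
w = np.where((y == 1) & (x[:, 0] > 0), 3.0, 1.0)
p = w / w.sum()

idx = rng.choice(n, size=n, replace=True, p=p)
x_shift, y_shift = x[idx], y[idx]

# The label marginal changes under the shift.
print(y.mean(), y_shift.mean())
```

Under this toy shift, the positive class is upweighted, so the shifted label marginal exceeds the original one; any estimator designed for SJS would need to recover such a change without labels on the shifted data.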
In order to accelerate the NES search phase, we generated the pool using the weight-sharing schemes proposed by Random Search with Weight Sharing [37] and DARTS [39]. Specifically, we trained one-shot weight-sharing models using each of these two algorithms, then we sampled architectures from the weight-shared models uniformly at random to build the pool.
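The uniform-sampling step above can be sketched as follows. This is an illustrative stand-in, not the authors' implementation: the operation list, cell encoding, and function names are assumptions, and in practice each sampled architecture would be evaluated by inheriting weights from the trained one-shot model rather than trained from scratch:

```python
import random

# Assumed toy search space: one operation chosen per edge of a cell,
# in the spirit of DARTS-style cell encodings.
OPS = ["sep_conv_3x3", "sep_conv_5x5", "max_pool_3x3", "skip_connect"]
NUM_EDGES = 8  # illustrative number of edges in one cell

def sample_architecture(rng):
    """Pick one operation per edge uniformly at random."""
    return tuple(rng.choice(OPS) for _ in range(NUM_EDGES))

def build_pool(pool_size, seed=0):
    """Sample `pool_size` architectures uniformly from the search space.

    Each architecture would then be scored cheaply by inheriting the
    corresponding weights from the trained one-shot model.
    """
    rng = random.Random(seed)
    return [sample_architecture(rng) for _ in range(pool_size)]

pool = build_pool(200)
print(len(pool), pool[0])
```

Sampling uniformly keeps the pool unbiased with respect to the search space, so the downstream NES selection, rather than the sampler, determines which architectures enter the ensemble.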