Supplementary Material for "Partial Optimal Transport with Applications on Positive-Unlabeled Learning"

Neural Information Processing Systems 

The proof involves 3 steps. [...] (with a constant A > 2ξ), since the GW formulation involves pairs of points. This yields the following cases:

Case 1: a > 0. In that case, φ(γ) is a convex function, whose minimum on [0, 1] is reached for γ [...]

Using the development in Section 1.2.2 of the supplemental, we can establish that [...]

The partial-OT computation is based on an augmented problem with a dummy point and, as such, is convex. On the contrary, the GW problem is non-convex and, although the algorithm is proved to converge, there is no guarantee that the global optimum is reached. The quality of the solution is therefore highly dependent on the initialization.
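The convex case above can be illustrated with a short sketch. Assuming the line-search objective has the quadratic form φ(γ) = aγ² + bγ + c (this form is an assumption for illustration, not stated explicitly in this excerpt), the minimizer of a convex quadratic on [0, 1] is the unconstrained stationary point −b/(2a) clipped to the interval:

```python
import numpy as np

def argmin_quadratic_unit_interval(a, b, c=0.0):
    """Minimizer of phi(gamma) = a * gamma**2 + b * gamma + c on [0, 1].

    Illustrative sketch for the convex case (a > 0): the unconstrained
    minimizer -b / (2a) is clipped to [0, 1].  The quadratic form of phi
    is an assumption made for this example.
    """
    assert a > 0, "this sketch only covers the convex case a > 0"
    return float(np.clip(-b / (2.0 * a), 0.0, 1.0))
```

For instance, `argmin_quadratic_unit_interval(1.0, -1.0)` returns the interior minimizer 0.5, while a large negative slope such as `b = -4.0` saturates the step at the boundary value 1.0.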
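The dummy-point construction for partial OT can be sketched as follows. This is a minimal illustration, not the paper's implementation: the cost matrix is augmented with one dummy point on each side that absorbs the untransported mass (the dummy-real cost `A` and the large dummy-dummy penalty that forbids matching the two dummies together are modeling choices assumed here), and the resulting balanced OT problem is solved as a linear program:

```python
import numpy as np
from scipy.optimize import linprog

def partial_ot_dummy(C, p, q, s, A=0.0):
    """Sketch of partial OT via an augmented (dummy-point) problem.

    Transports exactly mass `s` between marginals p and q by adding one
    dummy point per side that absorbs the leftover mass, then solving the
    balanced OT linear program on the augmented cost matrix.  `A` is the
    (assumed) dummy-to-real cost; the dummy-to-dummy cell gets a large
    penalty so the two dummies never match each other.
    """
    n, m = C.shape
    # Augmented marginals: each dummy absorbs the mass the other side drops.
    p_aug = np.append(p, q.sum() - s)
    q_aug = np.append(q, p.sum() - s)
    # Augmented cost: A for dummy-real cells, a large penalty for dummy-dummy.
    C_aug = np.full((n + 1, m + 1), float(A))
    C_aug[:n, :m] = C
    C_aug[n, m] = 1e6 * (abs(C).max() + abs(A) + 1.0)
    # Balanced OT as an LP: min <C_aug, T> with fixed row and column sums.
    N, M = n + 1, m + 1
    A_eq = np.zeros((N + M, N * M))
    for i in range(N):
        A_eq[i, i * M:(i + 1) * M] = 1.0   # row-sum constraints
    for j in range(M):
        A_eq[N + j, j::M] = 1.0            # column-sum constraints
    b_eq = np.concatenate([p_aug, q_aug])
    res = linprog(C_aug.ravel(), A_eq=A_eq, b_eq=b_eq,
                  bounds=(0, None), method="highs")
    T = res.x.reshape(N, M)
    return T[:n, :m]   # drop the dummy row and column
```

Because the augmented problem is a linear program, it is convex and admits a global optimum, which is the contrast drawn above with the non-convex GW objective.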