On Robust Optimal Transport: Computational Complexity, Low-rank Approximation, and Barycenter Computation

Le, Khang, Nguyen, Huy, Nguyen, Quang, Ho, Nhat, Pham, Tung, Bui, Hung

arXiv.org Machine Learning 

Recent advances in computation for the optimal transport (OT) problem [12, 3, 13, 7, 20, 23, 17, 18] have led to a surge of interest in using this tool in various domains of machine learning and statistics. The range of its applications is broad, including deep generative models [4, 14, 32], scalable Bayes [29, 30], mixture and hierarchical models [21], and other applications [28, 25, 10, 15, 33, 31, 8]. The goal of optimal transport is to find the minimal cost of moving mass between (the supports of) probability distributions. It is known, however, that the estimated transport cost is not robust to outliers. To deal with this issue, [34] proposed a trimmed version of optimal transport: they search for truncated probability distributions such that the optimal transport cost between them is minimized. However, trimmed optimal transport is non-trivial to compute, which hinders its use in practical applications. Another line of work proposes unbalanced optimal transport (UOT) to address the sensitivity of optimal transport to outliers [5, 26]. More specifically, the idea is to assign as little mass as possible to outliers by relaxing the marginal constraints of OT through a penalty function such as the KL divergence.
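
To make the relaxation concrete, the following is a sketch of a standard UOT formulation between measures $\mu$ and $\nu$ with ground cost $c$ and a regularization weight $\tau > 0$ (the exact form and penalty weights used in [5, 26] may differ in details):
\[
\mathrm{UOT}_{\tau}(\mu, \nu) \;=\; \min_{\pi \ge 0} \int c(x, y)\, d\pi(x, y) \;+\; \tau\, \mathrm{KL}(\pi_1 \,\|\, \mu) \;+\; \tau\, \mathrm{KL}(\pi_2 \,\|\, \nu),
\]
where $\pi_1$ and $\pi_2$ denote the first and second marginals of the transport plan $\pi$. Because the hard constraints $\pi_1 = \mu$ and $\pi_2 = \nu$ are replaced by KL penalties, the plan can assign little mass to an outlier whose transport would be expensive, paying only a bounded penalty instead.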
