Learning Elastic Costs to Shape Monge Displacements

Marco Cuturi

Neural Information Processing Systems

Given a source and a target probability measure, the Monge problem studies efficient ways to map the former onto the latter. This efficiency is quantified by defining a cost function between source and target data.






Scalable Gromov-Wasserstein Learning for Graph Partitioning and Matching

Hongteng Xu, Dixin Luo, Lawrence Carin

Neural Information Processing Systems

A graph is partitioned by comparing it, via the Gromov-Wasserstein discrepancy d_gw(G, G_dc), to a disconnected graph G_dc = G(V_dc, diag(...)) whose isolated components induce the clusters; the optimal transport matrix T = [T_ij] indicates the correspondence between nodes. The framework further extends to multi-graph partitioning and matching, and a recursive mechanism is proposed to accelerate the computation.
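A minimal sketch of entropic Gromov-Wasserstein with the squared loss, using the standard alternating linearization (solve an entropic OT problem at each step). This is the generic iteration, not the paper's scalable recursive scheme; all names below are illustrative.

```python
import numpy as np

def sinkhorn(a, b, C, eps, n_iter=100):
    # Standard Sinkhorn scaling for entropic OT with cost C.
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

def entropic_gw(C1, C2, p, q, eps=0.1, n_outer=20):
    # Alternate between linearizing the GW objective at the current plan T
    # and solving the resulting entropic OT problem with that local cost.
    T = np.outer(p, q)
    const = (C1**2 @ p)[:, None] + (C2**2 @ q)[None, :]
    for _ in range(n_outer):
        grad = const - 2.0 * C1 @ T @ C2.T
        T = sinkhorn(p, q, grad, eps)
    grad = const - 2.0 * C1 @ T @ C2.T
    return float(np.sum(grad * T)), T    # GW discrepancy and coupling

# Two 4-node graphs given by adjacency matrices, uniform node weights.
C1 = np.array([[0, 1, 1, 0],
               [1, 0, 1, 0],
               [1, 1, 0, 1],
               [0, 0, 1, 0]], dtype=float)
C2 = C1.copy()            # comparing a graph with itself
p = q = np.full(4, 0.25)
gw, T = entropic_gw(C1, C2, p, q)
```

The coupling `T` plays the role of the correspondence matrix above: large `T[i, k]` pairs node `i` of the first graph with node `k` of the second.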


Distilled Wasserstein Learning for Word Embedding and Topic Modeling

Hongteng Xu, Wenlin Wang, Wei Liu, Lawrence Carin

Neural Information Processing Systems

The word distributions of topics, their optimal transports to the word distributions of documents, and the embeddings of words are learned in a unified framework. When learning the topic model, we leverage a distilled underlying distance matrix to update the topic distributions and smoothly calculate the corresponding optimal transports.
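For concreteness, a toy sketch of the Wasserstein distance between a topic's word distribution and a document's word distribution under an embedding-based ground cost. The `C / (1 + C)` smoothing here is only a simple stand-in for the paper's distilled distance matrix, and all names below are illustrative.

```python
import numpy as np

def embedding_cost(E_src, E_tgt):
    """Euclidean ground cost between two sets of word embeddings."""
    diff = E_src[:, None, :] - E_tgt[None, :, :]
    return np.linalg.norm(diff, axis=-1)

def sinkhorn_cost(a, b, C, eps=0.1, n_iter=200):
    """Entropic OT cost between distributions a and b under ground cost C."""
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]
    return np.sum(P * C)

rng = np.random.default_rng(0)
E = rng.normal(size=(5, 8))                    # embeddings of a 5-word vocabulary
topic = np.array([0.5, 0.2, 0.1, 0.1, 0.1])    # topic word distribution
doc = np.array([0.1, 0.1, 0.2, 0.2, 0.4])      # document word distribution
C = embedding_cost(E, E)
# Smoothed ("distilled") cost: a bounded, softened version of C, standing in
# for the distilled underlying distance matrix used when updating topics.
C_dist = C / (1.0 + C)
d = sinkhorn_cost(topic, doc, C_dist)
```

In the unified framework, this distance would feed back into updates of both the topic distributions and the word embeddings that define `C`.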



2 Theoretical setting

Neural Information Processing Systems

Theoretically, the focus is on fitting a large class of problems into a single MinMax framework and generalizing regularization techniques known from classical optimal transport.