Talk page
Title:
Introduction to unbalanced optimal transport and its efficient computational solutions
Speaker:
Abstract:
In practice, optimal transport (OT) operates on empirical distributions that may contain acquisition artifacts, such as outliers or noise, which hinder a robust computation of the OT map. Additionally, OT requires the two distributions to have equal total mass, which can be overly restrictive in certain machine learning or computer vision applications where distributions may have arbitrary masses, or when only a fraction of the total mass needs to be transported. Unbalanced Optimal Transport addresses this by relaxing the marginal constraints, allowing some mass to be rebalanced or removed from the problem. Consequently, it is often considered more robust to these artifacts than its standard balanced counterpart. In this presentation, I will review several divergences for relaxing the marginals, ranging from vertical divergences, such as the Kullback-Leibler divergence or the L2-norm, which allow some mass to be removed, to horizontal ones, which enable a more robust formulation by redistributing mass between the source and target distributions. Finally, I will discuss efficient algorithms that do not require additional regularization of the OT plan.
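For reference, the marginal relaxation described above can be sketched as follows (notation is my own and not taken from the abstract: C is the cost matrix, pi the transport plan, a and b the source and target marginals, and tau_1, tau_2 the relaxation weights):

\[
\mathrm{UOT}(a, b) \;=\; \min_{\pi \ge 0}\; \langle C, \pi \rangle \;+\; \tau_1\, D\!\left(\pi \mathbf{1} \,\middle\|\, a\right) \;+\; \tau_2\, D\!\left(\pi^{\top} \mathbf{1} \,\middle\|\, b\right),
\]

where D is a vertical divergence such as the Kullback-Leibler divergence or the squared L2-norm; the hard marginal constraints of balanced OT are recovered in the limit as tau_1 and tau_2 tend to infinity.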
Link:
Workshop: