9th August 2022, 4:00pm - 5:00pm (GST)
Neural Optimal Transport
Solving optimal transport (OT) problems with neural networks has become widespread in machine learning. The majority of existing methods compute the OT cost and use it as the loss function to update the generator in generative models (Wasserstein GANs). In this presentation, I will discuss a fundamentally different and recently emerged direction -- methods that compute the OT plan (map) and use it as the generative model itself. Recent advances in this field demonstrate that these methods achieve performance comparable to that of WGANs, while offering a range of superior theoretical and practical properties.
The presentation will be based mainly on the recent work "Neural Optimal Transport" https://arxiv.org/abs/2201.12220. I will present a neural algorithm to compute OT plans (maps) for weak & strong transport costs. To this end, I will discuss key theoretical properties of OT duality that make it possible to develop efficient practical learning algorithms. Finally, I will demonstrate the performance of the algorithm on the unpaired image-to-image style transfer task.
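For readers who want a concrete anchor, the duality-based training objective for the strong-cost case can be sketched as the following saddle-point problem (the notation below is an assumption based on the arXiv paper's setting, not a quote from the talk): a transport map T, which pushes the source distribution P forward to the target Q, is learned jointly with a potential f, which plays a discriminator-like role.

```latex
\sup_{f}\;\inf_{T}\;
\underbrace{\int_{\mathcal{X}} c\bigl(x, T(x)\bigr)\,\mathrm{d}\mathbb{P}(x)}_{\text{transport cost of the map }T}
\;-\;\int_{\mathcal{X}} f\bigl(T(x)\bigr)\,\mathrm{d}\mathbb{P}(x)
\;+\;\int_{\mathcal{Y}} f(y)\,\mathrm{d}\mathbb{Q}(y)
```

In practice both f and T are parameterized by neural networks and the integrals are estimated from mini-batches, so the max-min structure is optimized with alternating stochastic gradient steps, much like adversarial training; unlike a WGAN generator, however, the learned T is itself the (approximate) OT map.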
Alexander received a Bachelor's degree in Mathematics and a Master's degree in Computer Science from the Higher School of Economics. He is currently a PhD student in Computer Science at the Skolkovo Institute of Science and Technology and a researcher at the Artificial Intelligence Research Institute. Alexander's research interests lie at the junction of deep generative modeling and computational optimal transport (OT). He works on developing efficient neural-network-based solvers for OT problems (computing OT costs, plans, barycenters, gradient flows) targeted at large-scale machine learning applications (image synthesis, image-to-image translation, image restoration, etc.).