Generative diffusion models and normalising flows
Organizers: Carsten Hartmann, Sebastian Reich
Abstract:
Generative diffusion models transform a data distribution into noise and then remove the noise by a reverse transformation, producing new data distributed approximately like the original; they offer state-of-the-art performance in generative AI for images. Normalising flows are deterministic or stochastic generative models that produce tractable distributions in which both sampling and parameter estimation can be exact or efficient; they have strong connections to data assimilation and optimal control. The goal of this minisymposium is to survey recent progress in these fields of research and to discuss connections between them. A particular focus is on applications to multiscale problems and stochastic control.
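The forward-noising and reverse-denoising mechanism described above can be sketched in a minimal toy example (not part of the abstract itself). Here the data are one-dimensional Gaussian, so the score of the noised marginal is known in closed form; all parameter values (noise schedule `beta`, horizon `T`, step counts) are illustrative choices, and a practical diffusion model would replace the exact score with a learned approximation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data distribution (assumed for this sketch): N(mu0, s0^2)
mu0, s0 = 2.0, 0.5
beta = 10.0            # constant noise schedule of the forward (VP) process
T, n_steps = 1.0, 1000
dt = T / n_steps
n_samples = 20_000

def marginal(t):
    """Mean and variance of the forward process at time t for Gaussian data."""
    decay = np.exp(-beta * t)
    m = mu0 * np.sqrt(decay)
    v = s0**2 * decay + (1.0 - decay)
    return m, v

def score(x, t):
    """Exact score grad log p_t(x); known here only because the data are Gaussian.
    In a real diffusion model this is a trained neural network."""
    m, v = marginal(t)
    return -(x - m) / v

# Reverse-time SDE, Euler-Maruyama, starting from (approximately) pure noise:
# dx = [f(x,t) - g^2 score(x,t)] dt + g dW,  f = -0.5*beta*x,  g = sqrt(beta)
x = rng.standard_normal(n_samples)
for i in range(n_steps, 0, -1):
    t = i * dt
    drift = -0.5 * beta * x - beta * score(x, t)
    x = x - drift * dt + np.sqrt(beta * dt) * rng.standard_normal(n_samples)

print(x.mean(), x.std())  # should be close to mu0 = 2.0 and s0 = 0.5
```

The forward process pushes the data toward a standard normal; integrating the reverse-time SDE with the (here analytic) score recovers samples whose empirical mean and standard deviation approximate those of the original distribution.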
