
Winter Term 2018/2019

Sep 13, 2018 - Feb 07, 2019

Hosts: Prof. Dr. R. Klein (FU), Prof. Dr. R. Kornhuber (FU), Prof. Dr. C. Schütte (FU/ZIB)
Location: Freie Universität Berlin, Institut für Mathematik, Arnimallee 6, 14195 Berlin-Dahlem, Room: 032 ground floor
Time: The seminar takes place on Thursdays at 4:00 pm


Thursday, 13.09.2018: Lecture

Krikamol Muandet, Max Planck Institute for Intelligent Systems, Tübingen
Counterfactual Policy Evaluation and Optimization in Reproducing Kernel Hilbert Spaces

In this talk, I will discuss the problem of evaluating and learning optimal policies directly from observational (i.e., non-randomized) data using a novel framework called counterfactual mean embedding (CME). Identifying optimal policies is crucial for improving decision-making in online advertisement, finance, economics, and medical diagnosis. The classical approach, considered a gold standard for identifying optimal policies in these domains, is randomization. For example, A/B testing has become one of the standard tools in online advertisement and recommendation systems, and in medicine the development of new treatments relies on controlled clinical trials. Unfortunately, randomization in A/B tests and controlled clinical trials may be expensive, time-consuming, unethical, or even impossible to implement in practice. To evaluate a policy from observational data, the CME maps the counterfactual distributions of potential outcomes under different treatments into a reproducing kernel Hilbert space (RKHS). Based on this representation, causal reasoning about the outcomes of different treatments can be performed over the entire landscape of counterfactual distributions using the kernel arsenal. Under some technical assumptions, we can also give a causal explanation of the resulting policies.

Joint work with Sorawit Saengkyongam, Motonobu Kanagawa, and Sanparith Marukatat.
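
As a rough illustration of the kernel machinery behind such embeddings (an editorial sketch, not the CME estimator of the talk, which additionally corrects for the non-randomized treatment assignment), the following compares outcome distributions under two treatments via empirical kernel mean embeddings and the maximum mean discrepancy (MMD):

    # Sketch: empirical kernel mean embeddings of outcomes under two treatments,
    # compared via the (biased) empirical MMD. Illustrative only.
    import numpy as np

    def gaussian_kernel(x, y, sigma=1.0):
        # k(x, y) = exp(-|x - y|^2 / (2 sigma^2)) for 1-d outcomes
        return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * sigma ** 2))

    def mmd2(y0, y1, sigma=1.0):
        # squared RKHS distance between the two empirical mean embeddings
        k00 = gaussian_kernel(y0, y0, sigma).mean()
        k11 = gaussian_kernel(y1, y1, sigma).mean()
        k01 = gaussian_kernel(y0, y1, sigma).mean()
        return k00 + k11 - 2 * k01

    rng = np.random.default_rng(0)
    y_control = rng.normal(0.0, 1.0, size=500)   # outcomes under treatment A
    y_treated = rng.normal(0.3, 1.0, size=500)   # outcomes under treatment B
    print("MMD^2 between outcome distributions:", mmd2(y_control, y_treated))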


Thursday, 18.10.2018: Lecture

Omar Knio, King Abdullah University of Science & Technology (KAUST)
Parameter calibration in general circulation models using polynomial chaos surrogates

This talk discusses the inference of physical parameters using model surrogates. Attention is focused on the use of sampling schemes to build suitable representations of the dependence of the model response on uncertain input data. Non-intrusive regularized regressions are used for this purpose. A Bayesian inference formalism is then applied to update the uncertain inputs based on available measurements or observations. To perform the update, we consider two alternative approaches, based on the application of Markov Chain Monte Carlo methods or of adjoint-based optimization techniques. We outline the implementation of these techniques to calibrate wind drag parametrizations, and to characterize the impact of initial conditions on the evolution of tropical cyclones.
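
For readers unfamiliar with the approach, here is a minimal, hypothetical sketch of the two ingredients: a Legendre polynomial chaos surrogate fitted by least squares to a handful of model runs, followed by a grid-based Bayesian update of the uncertain input. The toy forward model and all parameter values are editorial choices; real applications would call a circulation model and typically use regularized regression and MCMC or adjoint-based optimization.

    # Sketch: polynomial chaos surrogate + Bayesian calibration on a toy model.
    import numpy as np
    from numpy.polynomial import legendre

    def expensive_model(xi):
        # stand-in for the expensive forward model, xi in [-1, 1]
        return np.sin(2.0 * xi) + 0.5 * xi ** 2

    # non-intrusive construction: sample the input, run the model, fit coefficients
    xi_train = np.linspace(-1.0, 1.0, 20)
    y_train = expensive_model(xi_train)
    degree = 6
    V = legendre.legvander(xi_train, degree)             # design matrix
    coef, *_ = np.linalg.lstsq(V, y_train, rcond=None)   # regularization omitted here
    surrogate = lambda xi: legendre.legval(xi, coef)

    # Bayesian update of xi from one noisy observation, using the cheap surrogate
    y_obs, noise_std = 0.4, 0.1
    xi_grid = np.linspace(-1.0, 1.0, 2001)
    log_post = -0.5 * ((y_obs - surrogate(xi_grid)) / noise_std) ** 2   # flat prior
    post = np.exp(log_post - log_post.max())
    post /= np.trapz(post, xi_grid)
    print("posterior mean of xi:", np.trapz(xi_grid * post, xi_grid))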


Thursday, 01.11.2018: Colloquium

Annette Müller, Inst. für Meteorologie, FU Berlin: "What is... the Dynamic State Index?"

The Dynamic State Index (DSI) is a scalar diagnostic field that quantifies local deviations from a steady and adiabatic wind solution. Thus, this parameter indicates non-stationarity as well as diabaticity. The DSI concept was originally developed within Energy-Vorticity Theory on the basis of the full compressible flow equations.
Additional Dynamic State Indices based on reduced models of atmospheric motion, namely quasi-geostrophic (QG) theory and the Rossby model, allow for a scale-dependent diagnosis of various atmospheric processes. Using the COSMO-DE data set of the German Weather Service and ECMWF's ERA-Interim data set, it can be shown that each DSI captures different characteristics of scale-dependent atmospheric processes. The DSI for the primitive equations is highly correlated with precipitation on the convective scale, whereas the DSI for the QG model indicates precipitation clusters up to ‚Großwetterlagen' on the synoptic scale. Finally, the DSI for the Rossby model can be used to diagnose larger-scale weather situations on the synoptic scale, such as atmospheric blockings.
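
For orientation (this definition is not spelled out in the abstract; it is the form commonly quoted from Névir's Energy-Vorticity Theory), the DSI for the full compressible equations is a Jacobian determinant of three quantities that are conserved along the steady adiabatic basic state,

    \mathrm{DSI} \;=\; \frac{1}{\rho}\,\frac{\partial(\theta,\, B,\, \Pi)}{\partial(x,\, y,\, z)},

with density \rho, potential temperature \theta, Bernoulli function B and Ertel's potential vorticity \Pi; the index vanishes identically for steady adiabatic flow and becomes non-zero where these assumptions are violated.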

Robert Polzin, Freie Universität Berlin: "What is... a coherent set?"

Dynamical systems often exhibit the emergence of long-lived coherent sets. These are regions in state space that largely retain their geometric integrity over time. In this talk, a well-known method for extracting coherent sets from possibly sparse Lagrangian trajectory data is discussed. The method can be seen as an extension of diffusion maps to trajectory space and reveals the intrinsic low-dimensional organization of the data with respect to transport.
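
A minimal sketch of the idea (an editorial simplification assuming complete trajectories and a Gaussian kernel; the method discussed in the talk treats sparse and incomplete data more carefully):

    # Sketch: a diffusion-map-like embedding of Lagrangian trajectories.
    # traj has shape (n_trajectories, n_times, dim); trajectories that stay close
    # over the whole time window end up close in the embedding, so clustering the
    # leading nontrivial eigenvectors picks out coherent sets.
    import numpy as np

    def trajectory_embedding(traj, eps=1.0, n_coords=2):
        n = traj.shape[0]
        d2 = np.zeros((n, n))
        for t in range(traj.shape[1]):
            diff = traj[:, t, None, :] - traj[None, :, t, :]
            d2 += (diff ** 2).sum(axis=-1)
        d2 /= traj.shape[1]                       # time-averaged squared distances
        K = np.exp(-d2 / eps)                     # kernel matrix
        P = K / K.sum(axis=1, keepdims=True)      # row-normalized (Markov) matrix
        vals, vecs = np.linalg.eig(P)
        order = np.argsort(-vals.real)
        # drop the trivial constant eigenvector, keep the next n_coords
        return vecs[:, order[1:1 + n_coords]].real

    # toy data: two bundles of trajectories drifting apart
    rng = np.random.default_rng(1)
    t = np.linspace(0, 1, 30)
    a = np.stack([np.stack([ t + 0.05 * rng.normal(size=30), t], axis=1) for _ in range(50)])
    b = np.stack([np.stack([-t + 0.05 * rng.normal(size=30), t], axis=1) for _ in range(50)])
    coords = trajectory_embedding(np.concatenate([a, b]), eps=0.5)
    print(coords[:3], coords[-3:])   # the two bundles separate in the first coordinate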


Thursday, 15.11.2018: Colloquium

Michiel Renger, WIAS Berlin
"What are... Reaction Fluxes?"

As usual in thermodynamics (and many SFB1114 projects), chemical reaction networks can be studied on at least two different scales: a microscopic system of randomly reacting particles, and macroscopic concentrations that follow the path of minimal action. Mathematically, the challenge is to bridge both levels of description, and to study further scaling limits using the action functional. Physically, one can study the action to derive thermodynamic properties of the system. Our main philosophy is that both mathematically and physically, it is beneficial to take more information into account than just the concentrations...
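
To make "more information than just the concentrations" concrete (a generic mass-action illustration, not the speaker's notation): for a network with stoichiometric matrix \Gamma, the concentrations only record the net effect of the reaction fluxes,

    \dot{c}(t) \;=\; \Gamma\, j(t), \qquad
    j_r(t) \;=\; k_r^{+} \prod_i c_i(t)^{\alpha_{ir}} \;-\; k_r^{-} \prod_i c_i(t)^{\beta_{ir}},

where j_r is the net flux through reaction r. Different flux paths j can produce the same concentration path c, which is why keeping track of the fluxes retains information that the concentrations alone lose, both in the microscopic particle system and in the macroscopic action functional.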


Thursday, 29.11.2018: Lecture

Anatoly Kolomeisky, Rice University
How to Understand the Formation of Signaling Profiles in Biological Development

Concentration profiles of signaling molecules, also known as morphogen gradients, play a critical role in the development of multi-cellular organisms. A widely used approach to explain the establishment of morphogen gradients assumes that signaling molecules are produced locally, spread via free diffusion, and are degraded uniformly. However, recent experiments have produced observations that challenge this theoretical picture: the times to establish the morphogen gradient scale linearly with length, which is not expected for systems with unbiased diffusion. We propose a theoretical approach based on discrete-state stochastic analysis that provides a possible microscopic mechanism for these complex phenomena. It is argued that relaxation times are mostly determined by first-passage events, and that degradation effectively accelerates the diffusion of signaling particles by removing slow molecules. Thus degradation acts as an effective potential that drives signaling molecules away from the source. Furthermore, we analyzed a direct-delivery mechanism for the formation of signaling profiles via cellular extensions known as cytonemes. Different mechanisms of morphogen gradient formation are compared. Our theoretical analysis indicates that the spatial and temporal features of degradation efficiently control the establishment of signaling profiles.
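
As a point of reference, here is a minimal sketch of the standard synthesis-diffusion-degradation picture that the talk builds on and refines (mean-field occupation numbers on a 1D lattice; all parameter values are arbitrary editorial choices). The steady state decays exponentially away from the source:

    # Sketch: mean occupation numbers of a synthesis-diffusion-degradation model.
    # Production at site 0, hopping to nearest neighbours, uniform degradation.
    import numpy as np

    L, D, k_deg, q = 50, 1.0, 0.05, 1.0   # sites, hopping, degradation, production
    n = np.zeros(L)
    dt, steps = 0.01, 50000
    for _ in range(steps):
        dn = np.zeros(L)
        dn[1:] += D * (n[:-1] - n[1:])    # exchange with the left neighbour
        dn[:-1] += D * (n[1:] - n[:-1])   # exchange with the right neighbour
        dn -= k_deg * n                   # uniform degradation
        dn[0] += q                        # production at the source
        n += dt * dn
    # fit the exponential decay length of the steady-state profile (in sites)
    print("decay length:", -1.0 / np.polyfit(np.arange(L), np.log(n + 1e-12), 1)[0])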


Thursday, 06.12.2018: Lecture

Rainer Klages, Queen Mary University of London and TU Berlin
Statistical Physics and Anomalous Dynamics of Foraging

A question that has attracted a lot of attention over the past two decades is whether biologically relevant search strategies can be identified by statistical data analysis and mathematical modeling [1,2]. A famous paradigm in this field is the Lévy Flight Foraging Hypothesis. It states that, under certain mathematical conditions, Lévy dynamics, a key concept in the theory of anomalous stochastic processes, leads to an optimal search strategy for foraging organisms. This hypothesis is controversially discussed in the current literature. One problem is that Lévy dynamics implies scale-freeness, whereas in complex systems, and especially in biological dynamics, one might expect to see a hierarchy of different spatio-temporal scales. In my talk I will review examples and counterexamples of experimental data and their analyses confirming and refuting the Lévy Flight Foraging Hypothesis. Related to this debate is our own work on the biophysical modeling of bumblebee flights under predation threat based on experimental data analysis, which I will briefly outline [2].

[1] R. Klages, Extrem gesucht, Physik Journal 14, 22 (2015); Search for food of birds, fish and insects, chapter in: A. Bunde et al. (Eds.), Diffusive Spreading in Nature, Technology and Society (Springer, Berlin, 2018)
[2] F. Lenz et al., Phys. Rev. Lett. 108, 098103 (2012); PLoS ONE 8, e59036 (2013)
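
For readers who want to see the scale-freeness at work, a small toy comparison (an editorial illustration, unrelated to the bumblebee data of [2]): step lengths drawn from a power law with exponent mu between 1 and 3 lead to displacements dominated by rare long jumps, unlike Gaussian steps.

    # Sketch: Levy-flight vs. Brownian step lengths in 2D.
    import numpy as np

    rng = np.random.default_rng(0)
    n_steps, mu, l_min = 10000, 2.0, 1.0

    # inverse-transform sampling of power-law step lengths p(l) ~ l^(-mu), l >= l_min
    u = rng.random(n_steps)
    levy_steps = l_min * u ** (-1.0 / (mu - 1.0))
    gauss_steps = np.abs(rng.normal(0.0, l_min, n_steps))

    angles = rng.uniform(0, 2 * np.pi, n_steps)
    for name, steps in [("Levy", levy_steps), ("Brownian", gauss_steps)]:
        x = np.cumsum(steps * np.cos(angles))
        y = np.cumsum(steps * np.sin(angles))
        print(name, "end-to-end distance:", np.hypot(x[-1], y[-1]),
              "largest single step:", steps.max())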


Thursday, 13.12.2018: Lecture

Oliver Junge, TU München
Robust FEM-based extraction of finite-time coherent sets using scattered, sparse, and incomplete trajectories

Transport and mixing properties of aperiodic flows are crucial to a dynamical analysis of the flow, and often have to be determined from limited information. Finite-time coherent sets are regions of the flow that mix minimally with the remainder of the flow domain over the finite period of time considered. In the purely advective setting this is equivalent to identifying sets whose boundary interfaces remain small throughout their finite-time evolution. Finite-time coherent sets thus provide a skeleton of distinct regions around which more turbulent flow occurs. In geophysical systems they manifest, for example, as ocean eddies, ocean gyres, and atmospheric vortices. In real-world settings, observational data are often scattered and sparse, which makes the already difficult problem of coherent set identification and tracking even more challenging. We develop three FEM-based numerical methods to rapidly and reliably extract finite-time coherent sets from models or from scattered, possibly sparse, and possibly incomplete observed data.
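
A one-line summary of the underlying construction (an editorial paraphrase of the dynamic-Laplacian approach from the dynamic isoperimetry literature; the talk's three methods adapt its FEM discretization to scattered, incomplete data): for a volume-preserving flow map T, finite-time coherent sets are read off from the leading nontrivial eigenfunctions of a dynamic Laplace operator of the form

    \Delta^{\mathrm{dyn}} \;=\; \tfrac{1}{2}\bigl(\Delta + T^{*}\,\Delta\, T_{*}\bigr),

i.e. the average of the Laplacian at the initial time and the Laplacian at the final time pulled back along the flow. Level sets of these eigenfunctions yield regions whose boundaries stay short throughout the evolution.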


Thursday, 10.01.2019: Colloquium

Felix Höfling, Freie Universität Berlin: “What is... massively parallel computing?"

In the past two decades, computing architectures have seen a paradigm shift towards on-chip parallelisation, which has been perfected in many-core accelerators (GPUs, MICs). Such processors have boosted the current success of machine learning approaches and, being versatile devices for general computing tasks, find applications in diverse other fields. A single chip can perform several thousand computations simultaneously, at the price of a reduced flexibility of the instruction pipeline. Making this technology accessible to a larger class of applications requires numerical tasks that exhibit a massive parallelism, which may be achieved by rethinking our implementations of standard algorithms.
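
A deliberately simple sketch of the data-parallel programming model (illustrative only; many-core accelerators apply the same idea with thousands of lightweight, less flexible execution units per chip):

    # Sketch: the same embarrassingly parallel map expressed serially and with a
    # process pool; each element is processed independently, with no communication.
    from multiprocessing import Pool
    import math

    def work(x):
        # an independent per-element task
        return math.sin(x) ** 2 + math.cos(x) ** 2

    if __name__ == "__main__":
        data = [i * 0.001 for i in range(1_000_000)]
        serial = [work(x) for x in data]             # one element after another
        with Pool() as pool:                         # all cores at once
            parallel = pool.map(work, data, chunksize=10_000)
        print(serial[:3], parallel[:3])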

Tobias Kramer, ZIB: "What is... efficiently solving hierarchical equations of motion?"

The hierarchical equations of motion (HEOM) provide an exact solution for open quantum system dynamics, also for larger systems. An important application is the description of energy transfer in photosynthetic systems from the antenna to the reaction center and the computation of the corresponding time-resolved spectra. The method captures non-Markovian effects and strong system-environment interactions, which affect the thermalization and decoherence process. We provide an efficient implementation of the method for parallel and distributed computers [1], which is also available as a GPU cloud computing tool at nanohub.org [2]. The obtained data is efficiently compressed using neural networks and machine learning [3].
Reference:
[1] T. Kramer, M. Noack, A. Reinefeld, M. Rodriguez, Y. Zelinskyy: Efficient calculation of open quantum system dynamics and time-resolved spectroscopy with Distributed Memory HEOM (DM-HEOM); Journal of Computational Chemistry (2018),
https://doi.org/10.1002/jcc.25354
[2] nanoHUB.org: Exciton Dynamics Lab for Light-Harvesting Complexes (GPU-HEOM)
https://nanohub.org/resources/gpuheompop/usage
[3] M. Rodriguez, T. Kramer: Machine Learning of Two-Dimensional Spectroscopic Data
https://arxiv.org/abs/1810.01124


Thursday, 24.01.2019: Lecture

Freddy Bouchet, ENS de Lyon et CNRS
Climate extremes and rare trajectories in astronomy computed using rare event algorithms and large deviation theory

I will discuss a set of recent developments in non-equilibrium statistical mechanics applied to the climate and to solar system dynamics. The first application concerns extreme heat waves as an example of rare events with huge impacts. Using a large deviation algorithm with a climate model (GCM), we were able to gain two orders of magnitude in the estimation of very rare events and to study phenomena that cannot be studied otherwise. The second application is the study of rare trajectories that change the structure of a planetary system. Their understanding involves large deviation and instanton theory.
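
The flavour of such rare-event algorithms, in a stripped-down form (an editorial toy version of multilevel splitting, not the genealogical algorithm coupled to a GCM used in the work described here): the rare event is decomposed into a chain of much less rare conditional events, each estimated with a modest ensemble restarted from the states that reached the previous stage.

    # Sketch: multilevel splitting for a rare level-crossing of a random walk
    # with negative drift. P(ever exceed b) is estimated as a product of
    # conditional crossing probabilities between intermediate levels.
    import numpy as np

    rng = np.random.default_rng(0)
    drift, sigma = -0.2, 1.0
    levels = np.arange(1.0, 6.0, 1.0)        # intermediate levels up to b = 5
    n_walkers, max_steps = 2000, 10000

    def run_until(x0, target):
        # evolve one walker until it exceeds `target` or drops below 0
        x = x0
        for _ in range(max_steps):
            x += drift + sigma * rng.normal()
            if x >= target:
                return x          # entrance state above the level
            if x <= 0.0:
                return None       # failed
        return None

    starts, prob = np.zeros(n_walkers), 1.0
    for level in levels:
        hits = [s for s in (run_until(x0, level) for x0 in starts) if s is not None]
        prob *= len(hits) / len(starts)
        if not hits:
            break
        # resample starting states for the next stage from the successful walkers
        starts = rng.choice(hits, size=n_walkers)
    print("estimated crossing probability:", prob)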


Thursday, 07.02.2019: Lecture

Ramon Grima, University of Edinburgh
Spatial stochastic models of intracellular dynamics in dilute and crowded conditions

Stochastic effects in biochemical reaction systems are commonly studied by means of the reaction-diffusion master equation (RDME). The RDME is computationally efficient and has the advantage of being amenable to analysis. However, the RDME assumes point-particle interactions, i.e. dilute conditions. This presents a problem because the intracellular environment can be highly crowded, with up to 40% of its volume being occupied by various macromolecules. In this talk I will discuss our recent work showing how the RDME can be modified to take volume-excluded interactions into account. The modified equation can be solved in certain conditions, yielding explicit expressions for the dependence of reactant number fluctuations on the available volume fraction of space. I will also discuss how one can use the modified RDME to derive coupled nonlinear partial differential equations which describe molecular movement in highly heterogeneous crowded environments and which hence offer a realistic alternative to the classical diffusion equation. All results will be contrasted with those obtained from Brownian dynamics.
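
To make the modelling framework concrete (an editorial toy sketch of an RDME-type lattice simulation with a naive volume-exclusion rule, not the speaker's modified RDME): molecules are produced, hop between compartments, and degrade, and crowding is mimicked by capping the number of molecules a compartment can hold.

    # Sketch: Gillespie simulation of a 1D reaction-diffusion master equation with
    # production, degradation and hopping; hops into a full compartment are forbidden.
    import numpy as np

    rng = np.random.default_rng(0)
    L, capacity = 20, 10            # compartments and max molecules per compartment
    d, k_deg, k_prod = 1.0, 0.1, 2.0
    n = np.zeros(L, dtype=int)
    t, t_end = 0.0, 200.0

    while t < t_end:
        hop_right = d * n[:-1] * (n[1:] < capacity)   # blocked if target is full
        hop_left = d * n[1:] * (n[:-1] < capacity)
        rates = np.concatenate(([k_prod * (n[0] < capacity)],   # production at site 0
                                k_deg * n, hop_right, hop_left))
        total = rates.sum()
        if total == 0:
            break
        t += rng.exponential(1.0 / total)
        r = rng.choice(len(rates), p=rates / total)
        if r == 0:
            n[0] += 1                            # production
        elif r <= L:
            n[r - 1] -= 1                        # degradation at site r-1
        elif r <= 2 * L - 1:
            i = r - L - 1; n[i] -= 1; n[i + 1] += 1   # hop to the right
        else:
            i = r - 2 * L; n[i + 1] -= 1; n[i] += 1   # hop to the left
    print("occupancy profile:", n)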