The Computational Geometric Learning project aims to extend the success story of geometric algorithms with guarantees to high dimensions. This is not a straightforward task: for many problems, no efficient algorithms are known that compute the exact solution in high dimensions, a phenomenon commonly called the curse of dimensionality. We address the curse of dimensionality by focusing on inherent structure in the data, such as sparsity or low intrinsic dimension, and by resorting to fast approximation algorithms.
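One standard way to see the curse of dimensionality concretely is the concentration of distances: as the dimension grows, the distances from a query point to random data points become nearly indistinguishable, which undermines naive nearest-neighbor search. The following small Python sketch (an illustration, not part of the project's codebase) measures the relative contrast (max - min) / min of distances from the origin to uniformly random points in the unit cube; the contrast shrinks rapidly with the dimension d.

```python
import math
import random

def relative_contrast(d, n=200, seed=0):
    """Sample n uniform points in [0, 1]^d and return the relative
    contrast (max - min) / min of their distances from the origin.
    A small contrast means all points look roughly equally far away."""
    rng = random.Random(seed)
    dists = []
    for _ in range(n):
        p = [rng.random() for _ in range(d)]
        dists.append(math.sqrt(sum(x * x for x in p)))
    return (max(dists) - min(dists)) / min(dists)

# The contrast drops from "large" in the plane to a few percent in
# thousands of dimensions, illustrating distance concentration.
for d in (2, 20, 200, 2000):
    print(d, round(relative_contrast(d), 3))
```

This is exactly the regime where exact high-dimensional geometric computation becomes both expensive and less meaningful, motivating the approximation algorithms mentioned above.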
The goals of the project are best described along three axes: (1) building the foundations of a new field, computational geometric learning; (2) developing algorithms and data structures that exploit structure in the data in order to scale well with increasing dimension, i.e., to avoid the curse of dimensionality; and (3) providing robust and efficient implementations, with theoretical guarantees, of foundational algorithms and data structures for high-dimensional geometric data processing.
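A classic example of an algorithm with guarantees that scales well with dimension, in the spirit of axis (2), is random projection: by the Johnson-Lindenstrauss lemma, projecting with a scaled Gaussian matrix to roughly O(log n / eps^2) dimensions preserves all pairwise distances up to a factor (1 +/- eps). The sketch below (an illustrative example, not the project's implementation; the function name and parameters are my own) projects d-dimensional points to k dimensions.

```python
import math
import random

def random_projection(points, k, seed=0):
    """Project d-dimensional points to k dimensions using a random
    Gaussian matrix scaled by 1/sqrt(k), so that Euclidean distances
    are approximately preserved (Johnson-Lindenstrauss style)."""
    rng = random.Random(seed)
    d = len(points[0])
    # k x d matrix of i.i.d. N(0, 1/k) entries.
    R = [[rng.gauss(0, 1) / math.sqrt(k) for _ in range(d)]
         for _ in range(k)]
    return [[sum(row[j] * p[j] for j in range(d)) for row in R]
            for p in points]
```

For instance, projecting a handful of 500-dimensional points down to k = 100 dimensions typically distorts each pairwise distance by well under a factor of two, while cutting the cost of all subsequent distance computations fivefold.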