
Facets of Complexity

This two-day conference brings together researchers from mathematics, computer science, and physics who approach the subject of complexity from various viewpoints.

Thursday, September 29
9:50 Opening
10:00 Till Tantau, Lübeck

A gentle introduction to applications of algorithmic metatheorems for space and circuit classes

11:00 Coffee break
11:30 Artur Czumaj, Warwick

Testing graph properties very efficiently

12:30 Lunch break
14:00 Stephan Mertens, Magdeburg

When formulas freeze: Phase transitions in computation

15:00 Harry Buhrman, Amsterdam

Quantum computing and complexity theory

16:00 Coffee break
16:30 Jan Vybíral, Prague

Information-based complexity

Friday, September 30
10:00 Nati Linial, Jerusalem

The phase transition in high dimensions

11:00 Coffee break
11:30 Markus Bläser, Saarbrücken

Polynomial identity testing

Participation is free. Please register by sending an email with the following contents to Facets-ofComplexity-register@lists.fu-berlin.de:

Name:             _______
Affiliation/city: _______

I will attend the conference Facets of Complexity
from:  _______ (default: Thursday, Sept. 29, morning)
until: _______ (default: Friday, Sept. 30, noon)
=================================================

If you would like to receive program updates, travel information, or last-minute announcements, you can subscribe to our mailing list.

Time & Location

29.09.2016, 10:00 s.t. - 30.09.2016

Zuse-Institut Berlin (ZIB) 
Takustraße 7, 14195 Berlin 
main lecture hall, ground floor 
Map
Visitor information with travel directions, hotels and restaurants

Abstracts

Thursday, September 29
9:50 Opening
10:00 Till Tantau, Lübeck

A Gentle Introduction to Applications of Algorithmic Metatheorems for Space and Circuit Classes

Algorithmic metatheorems state that if a problem can be described in a certain logic and the inputs are structured in a certain way, then the problem can be solved with a certain amount of resources. As an example, by Courcelle's Theorem all monadic second-order ("in a certain logic") properties of graphs of bounded tree width ("structured in a certain way") can be decided in linear time ("with a certain amount of resources"). Such theorems have become a valuable tool in algorithmics: If a problem happens to have the right structure and can be described in the right logic, they immediately yield a (typically tight) upper bound on the time complexity of the problem. Perhaps even more importantly, several complex algorithms rely on algorithmic metatheorems internally to solve subproblems, which broadens the range of applications of these theorems considerably. This talk is intended as a gentle introduction to the ideas behind algorithmic metatheorems, especially behind some recent results concerning space and circuit classes, and tries to give a flavor of the range of their applications.
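
To make "described in a certain logic" concrete, here is a standard textbook example (added here for illustration, not taken from the talk): 3-colorability of a graph, expressed as a monadic second-order formula that quantifies over three vertex sets R, G, B. By Courcelle's Theorem, any property expressible in this form can be decided in linear time on graphs of bounded tree width.

    \exists R\,\exists G\,\exists B\;\Bigl[
        \forall v\,\bigl(R(v)\lor G(v)\lor B(v)\bigr)
        \;\land\;
        \forall u\,\forall v\,\Bigl(E(u,v)\rightarrow
            \neg\bigl(R(u)\land R(v)\bigr)\land
            \neg\bigl(G(u)\land G(v)\bigr)\land
            \neg\bigl(B(u)\land B(v)\bigr)\Bigr)\Bigr]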

11:00 Coffee break
11:30 Artur Czumaj, Warwick

Testing Graph Properties very Efficiently

I will survey recent advances on the problem of testing graph properties. We will consider the generic problem in which, for a given input graph G and a given graph property P (e.g., P may mean bipartiteness, 3-colorability, or planarity), we would like to determine whether G satisfies property P or not. While the exact problem as defined here is often known to be computationally very hard (e.g., NP-hard, or even undecidable), we will focus on a simpler task: we will want to distinguish the input graphs that satisfy property P from the graphs that are “far away” from satisfying property P. Being “far away” means that one has to modify the input graph in more than, say, 1% of its representation to obtain a graph satisfying property P.

We will survey recent results in this area and show that for many basic properties, one can test them in this framework very efficiently, often in sublinear-time, and sometimes even in constant time.
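
As a deliberately simplified illustration of such a tester (added here, not part of the talk), the following sketch follows the spirit of the well-known dense-model bipartiteness tester of Goldreich, Goldwasser, and Ron: sample a constant number of vertices, query only the pairs inside the sample, and accept iff the sampled subgraph is bipartite. The sample size below is a rough placeholder, not the tight bound from the literature.

    # Minimal sketch of a property tester in the dense-graph (adjacency-matrix) model.
    import random
    import networkx as nx  # used only for the exact bipartiteness check on the sample

    def test_bipartite(adj, eps, n):
        """adj(u, v) -> bool answers adjacency queries; n is the number of vertices.
        Accepts every bipartite graph; rejects graphs that are eps-far from
        bipartite with constant probability."""
        s = min(n, int(20 / eps ** 2))        # constant-size sample, independent of n
        sample = random.sample(range(n), s)
        H = nx.Graph()
        H.add_nodes_from(sample)
        for i, u in enumerate(sample):        # query only the sampled pairs
            for v in sample[i + 1:]:
                if adj(u, v):
                    H.add_edge(u, v)
        return nx.is_bipartite(H)             # accept iff the sampled subgraph is bipartite

The number of queries depends only on eps, not on the size of the graph, which is what "constant time" means in this framework.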

14:00 Stephan Mertens, Magdeburg

When Formulas Freeze: Phase Transitions in Computation

Random instances of NP-complete problems can often be solved in polynomial time. For many problems, the average-case complexity changes drastically as the parameters of the statistical ensemble are tuned to critical values, similar to the phase transitions in physics, where a tiny change of temperature turns a block of ice into water. In this talk I will review the phase transitions in two classical NP-complete problems: k-SAT and Integer-Partitioning, and the mathematical (and "physical") methods used to analyze these transitions. In particular, I will explain the classical moment bounds, "equations of motion" for algorithms and the powerful message passing algorithms for k-SAT.
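
The transition for random 3-SAT can be observed directly in a small experiment (an illustration added here, not part of the talk): generate random formulas at clause-to-variable ratio alpha and check satisfiability by brute force; around alpha ≈ 4.27 the fraction of satisfiable formulas drops sharply. Brute force keeps the sketch self-contained, so n must stay small.

    import random
    from itertools import product

    def random_3sat(n, m):
        """m random clauses over n variables; a clause is 3 literals on distinct variables."""
        return [[v + 1 if random.random() < 0.5 else -(v + 1)
                 for v in random.sample(range(n), 3)] for _ in range(m)]

    def satisfiable(clauses, n):
        # Exhaustive search over all 2^n assignments (only feasible for small n).
        for bits in product([False, True], repeat=n):
            if all(any(bits[abs(l) - 1] == (l > 0) for l in clause)
                   for clause in clauses):
                return True
        return False

    n, trials = 12, 20
    for alpha in [3.0, 3.5, 4.0, 4.27, 4.5, 5.0]:
        m = int(alpha * n)
        sat = sum(satisfiable(random_3sat(n, m), n) for _ in range(trials))
        print(f"alpha = {alpha:4.2f}: {sat}/{trials} formulas satisfiable")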

15:00 Harry Buhrman, Amsterdam

Quantum Computing and Complexity Theory

Quantum computers hold great promise as the next generation of hardware. They are based on counter-intuitive phenomena from quantum mechanics, like superposition, interference, and entanglement. The basic building block of a quantum computer is a quantum bit or qubit, which unlike a classical bit can be in a quantum superposition (a simultaneous combination) of both 0 and 1. In the 1990s it was demonstrated that, for specific problems, quantum algorithms run on a quantum computer can massively outperform classical computers. The famous quantum algorithm of Peter Shor shows that a quantum computer can factor large numbers and thus break most of modern-day cryptography.

Recent years have witnessed important breakthroughs in the development of the quantum computer. The UC Santa Barbara group led by Martinis (now part of Google's Quantum Artificial Intelligence Lab) achieved a high-performance 9-qubit platform. Other labs around the world have reported similar progress. At this growth rate we will have 50 qubits within five years, and small-scale quantum computers are expected in 10-15 years. What is the power of a quantum computer and how does it relate to the classical computation paradigm? In this talk I will give a short introduction to quantum computing and will highlight some of the recent results and ideas relating classical and quantum information processing. No prior knowledge of quantum mechanics is necessary.
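
As a minimal illustration of the superposition and entanglement mentioned above (a toy example added here, not part of the talk), one can simulate the state vector of one and two qubits directly with NumPy:

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])
    ket0 = np.array([1, 0])

    # One qubit: H|0> is an equal superposition of 0 and 1.
    print(H @ ket0)                    # [0.707 0.707]

    # Two qubits: apply H to the first qubit of |00>, then CNOT.
    # The result is the entangled Bell state (|00> + |11>)/sqrt(2):
    # measuring one qubit fixes the outcome of the other.
    state = np.kron(H @ ket0, ket0)
    bell = CNOT @ state
    print(bell)                        # [0.707 0. 0. 0.707]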

16:00 Coffee break
16:30 Jan Vybíral, Prague

Information-Based Complexity

Many problems in physics, finance, biology and many other areas of science are described by functions of d variables, where d is in the hundreds, thousands, or more. Typical algorithms of approximation theory for integration or approximation of such functions depend exponentially on d, an effect known as the curse of dimension. As the importance of high-dimensional approximation theory has grown steadily in the last decades, new algorithms were developed which can overcome this drawback. In my talk I will give an overview of the field of Information-Based Complexity, which systematically studies the dependence of algorithms on the underlying dimension of the problem. We will review the notions used, as well as the most important (and sometimes rather surprising) results on the (im)possibility of solving high-dimensional problems.
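
A back-of-the-envelope calculation (added here for illustration, under the standard assumption of a smoothness parameter r) shows where the exponential dependence on d comes from. A full tensor-product grid with m points per coordinate uses

    N(m, d) = m^d \qquad\text{points, e.g. } N(10, 100) = 10^{100},

and for a classical product rule whose one-dimensional error decays like m^{-r}, the error in terms of the total number N of function evaluations behaves like

    e(N) = O\!\left(N^{-r/d}\right),

so the rate of convergence deteriorates exponentially with the dimension d.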

Friday, September 30
10:00 Nati Linial, Jerusalem

The Phase Transition in High Dimensions

Perhaps the most dramatic phenomenon in G(n,p) theory is the phase transition that random graphs undergo around p=1/n and in particular the emergence of the giant component. In recent years, we studied an analogous model of random simplicial complexes. They, too, undergo a phase transition, but in dimensions 2 and above the phase transition is even more dramatic. My talk will be self-contained and all the necessary background will be provided.

This is based on joint papers with Lior Aronshtam, Tomasz Łuczak, Roy Meshulam and Yuval Peled.
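
The one-dimensional picture (the graph case) is easy to reproduce. The following sketch (an added illustration, not from the talk) samples G(n,p) for p = c/n and reports the size of the largest connected component, which jumps from logarithmic to a linear fraction of the vertices as c crosses 1:

    import networkx as nx

    n = 20000
    for c in [0.5, 0.9, 1.0, 1.1, 1.5, 2.0]:
        G = nx.fast_gnp_random_graph(n, c / n, seed=0)
        giant = max(nx.connected_components(G), key=len)
        print(f"p = {c:.1f}/n: largest component has {len(giant)} of {n} vertices")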

11:00 Coffee break
11:30 Markus Bläser, Saarbrücken

Polynomial Identity Testing

Does randomness help to speed up computation? This is a fundamental question in computer science. Inspired by early successes like randomized primality tests, many researchers believed for a long time that randomness indeed helps to speed up computations. In 1997, Impagliazzo and Wigderson proved that the existence of certain hard-to-compute functions implies that randomness can only give a speedup by a polynomial factor. Shortly after this, Agrawal, Kayal, and Saxena were able to give a deterministic polynomial time primality test.

We still do not know whether for every problem having a randomized polynomial time algorithm there is also a deterministic polynomial time one. In this talk, we will look at the flagship problem that has a randomized polynomial time algorithm, but for which we do not know whether a deterministic one exists: the so-called polynomial identity testing problem. We are given an arithmetic circuit computing some polynomial and we want to decide whether this polynomial is identically zero. While there is an easy randomized algorithm, we do not know whether a deterministic polynomial time algorithm exists, and it can be shown that the existence of such an algorithm implies lower bounds for circuit size, a partial converse to the result by Impagliazzo and Wigderson. In this talk, we will introduce the polynomial identity testing problem and its variants and give an overview of deterministic algorithms for some special cases. In particular, we will present some recent results on deterministically computing the rank of symbolic matrices.
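
The "easy randomized algorithm" mentioned above rests on the Schwartz-Zippel lemma: a polynomial of total degree d that is not identically zero vanishes on a uniformly random point from a set S with probability at most d/|S|. The following sketch (added for illustration; the prime modulus and the helper name probably_zero are illustrative choices, not from the talk) treats the circuit as a black box that can be evaluated at a point:

    import random

    P = (1 << 61) - 1          # a large prime; evaluation is done modulo P

    def probably_zero(circuit, nvars, trials=20):
        """Return True if the polynomial computed by `circuit` is (probably)
        identically zero. One-sided error: a polynomial of degree d that is
        nonzero modulo P survives a single trial with probability at most d/P."""
        for _ in range(trials):
            point = [random.randrange(P) for _ in range(nvars)]
            if circuit(point) % P != 0:
                return False       # witness found: definitely not the zero polynomial
        return True

    # Example: (x + y)^2 - x^2 - 2xy - y^2 is identically zero, the second is not.
    f = lambda v: (v[0] + v[1]) ** 2 - v[0] ** 2 - 2 * v[0] * v[1] - v[1] ** 2
    g = lambda v: (v[0] + v[1]) ** 2 - v[0] ** 2 - v[1] ** 2      # equals 2xy
    print(probably_zero(f, nvars=2))   # True
    print(probably_zero(g, nvars=2))   # False (with overwhelming probability)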

Contact Information

Günter Rote, Department of Computer Science, Facets-ofComplexity-register@lists.fu-berlin.de.