I share my time between Columbia University and the University of Warsaw. At Columbia, I am Professor of Computer Science and at Warsaw I am Professor of Applied Mathematics.

I am mostly interested in the computational complexity of continuous problems. For most continuous problems, we have only partial information. For problems defined on spaces of functions, this partial information is usually given by a finite number of function values at some points which we can choose. One of the central issues is to determine how many pieces of information, or function values, are needed to solve the computational problem to within a prescribed accuracy. The partial information plays the main role in this line of research, and that is why this field is called information-based complexity. The error and cost of algorithms, as well as the complexity, can be defined in various settings, including the worst case, the average case, the randomized, the probabilistic, and the recently added setting of quantum computation.
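As a small illustration of computing from partial information (this sketch is my own and not from the text above): an algorithm may only see finitely many function values, and the question is how many are needed for a prescribed accuracy. The midpoint rule below uses n values of f to approximate an integral; the choice of integrand is arbitrary.

```python
import math

def midpoint_rule(f, n):
    """Approximate the integral of f over [0, 1] using only n function
    values at midpoints -- the 'partial information' about f."""
    return sum(f((i + 0.5) / n) for i in range(n)) / n

# Example: f(x) = exp(x), whose exact integral over [0, 1] is e - 1.
# The error shrinks as we allow more pieces of information.
exact = math.e - 1.0
for n in (4, 16, 64):
    print(n, abs(midpoint_rule(math.exp, n) - exact))
```

The number n of function values needed to reach a given error tolerance is exactly the kind of quantity information-based complexity studies.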

The quantum setting assumes that we can perform computations on both classical and quantum computers. Even though building a quantum computer remains a difficult task, a quantum computer can lower the complexity of many continuous problems. Usually we have exponential speedups between the worst case and the quantum complexities, and polynomial speedups between the randomized and quantum complexities, for a number of continuous problems such as integration, path integration, approximation, and the solution of ordinary or partial differential equations.

I am especially interested in the tractability of multivariate problems, which are defined on spaces of functions of d variables. The main emphasis is on large d, where d is in the hundreds or even thousands. There are many important practical computational problems with such a huge d; they occur in mathematical finance, physics, statistics, and other areas.

Suppose we want to approximate a multivariate problem to within a prescribed error tolerance. Tractability means that we can do so at a cost which is polynomial in d and in the reciprocal of the error tolerance. For typical multivariate problems defined on classical spaces we often have intractability in the worst case setting, since the complexity depends exponentially on the number d of variables. This is called the curse of dimensionality. To break this curse we can switch to the randomized or the average case settings. Sometimes this works, and the best example is probably Monte Carlo in the randomized setting for multivariate integration. Monte Carlo breaks the curse if the variances of the functions depend at most polynomially on d. Another alternative, which has recently been used in many papers, is to stay in the worst case setting but use additional knowledge about the multivariate problem to shrink the class of functions. This leads to weighted classes of functions. Necessary and sufficient conditions for tractability, in terms of the weights, are known today for many multivariate problems. My recent papers are mostly devoted to tractability and quantum computation and can be found in the publication section of this site.
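A minimal sketch of why Monte Carlo can break the curse of dimensionality for integration (the integrand here is my own illustrative choice, not from the text): its expected error is the standard deviation of f divided by the square root of the number of samples, with no explicit dependence on d, so the cost stays modest whenever the variance grows at most polynomially in d.

```python
import math
import random

def monte_carlo_integral(f, d, n, seed=0):
    """Estimate the integral of f over the d-dimensional unit cube
    from n random function values. The expected error is
    sigma(f) / sqrt(n), with no explicit dependence on d."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = [rng.random() for _ in range(d)]
        total += f(x)
    return total / n

# Illustrative integrand: f(x) = prod_j (1 + (x_j - 0.5)/d).
# Its integral over the unit cube is exactly 1 for every d, and its
# variance stays bounded as d grows, so Monte Carlo handles large d.
d = 100
f = lambda x: math.prod(1.0 + (xj - 0.5) / d for xj in x)
est = monte_carlo_integral(f, d, n=20000)
print(abs(est - 1.0))
```

With d = 100 a worst-case deterministic method on a classical function class may need a number of function values exponential in d, while the randomized estimate above uses only a few thousand samples.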