Information-based complexity (IBC) is the branch of computational complexity that studies problems for which the information is partial, contaminated, and priced.
To motivate these assumptions about information, consider a continuous (as opposed to discrete) dynamical system. It consists of an equation that determines how the system evolves over time and an initial condition. Assume that the equation depends on coefficients that are real functions of a real variable. Since a digital computer can store only a finite set of numbers, these functions must be replaced by such finite sets (for example, by evaluating the functions at a finite number of points). Therefore, we have only partial information about the equation of evolution and, similarly, about the initial condition. Furthermore, the function values will be contaminated by round-off error. Finally, evaluating the functions can be expensive.
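The three assumptions can be made concrete with a small sketch (ours, not from the text): estimating an integral by Monte Carlo, where the integrand is available only as a black box whose values are noisy and whose evaluations are counted. The function names and the noise model are illustrative assumptions.

```python
import random

random.seed(0)  # deterministic run for this sketch

def f(x):
    """A 'black-box' integrand: in IBC terms we never hold f itself,
    only its values at finitely many sampled points."""
    return x * x

def noisy_eval(f, x, noise, cost_counter):
    """Partial, contaminated, priced information about f."""
    cost_counter[0] += 1               # priced: every evaluation is counted
    return f(x) + random.uniform(-noise, noise)  # contaminated: round-off-like error

def mc_integrate(f, n=10_000, noise=1e-3):
    """Approximate the integral of f over [0, 1] from n noisy samples."""
    cost = [0]
    total = sum(noisy_eval(f, random.random(), noise, cost) for _ in range(n))
    return total / n, cost[0]

estimate, evaluations = mc_integrate(f)
# The estimate is built from only `evaluations` function values, so the
# true integral (1/3 for this integrand) can never be known exactly.
```

Because the information is finite and noisy, any algorithm can only approximate the answer; IBC asks how good that approximation can be at a given information cost.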
We regard IBC as a foundational subject and make as few assumptions as possible. We might assume, for example, that the problem is specified by a linear operator and that the information consists of linear functionals. Everything else is a consequence of the theory.
This gives the theory both generality and simplicity, leading to unity across a wide variety of applications.
The applications typically involve multivariate problems of science and engineering. Examples include high-dimensional integration, ordinary and partial differential equations, nonlinear optimization, approximation, and ill-posed problems. Although the focus has been on the complexity of computations, the complexity of verification and implementation testing has also been studied.
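To see why high-dimensional integration is a flagship IBC problem, note that a classical tensor-product quadrature rule with n nodes per coordinate needs n**d evaluations in d dimensions, so its information cost grows exponentially with dimension. A minimal illustration (the function name is ours):

```python
def product_grid_size(n: int, d: int) -> int:
    """Number of evaluations a tensor-product rule with n nodes
    per coordinate requires in d dimensions: n**d."""
    return n ** d

# The cost explodes with dimension: 10 nodes per axis already
# needs a billion evaluations in 9 dimensions.
sizes = [product_grid_size(10, d) for d in (1, 3, 6, 9)]
# → [10, 1000, 1000000, 1000000000]
```

This exponential blow-up is what makes the cost of information, rather than combinatorial operations, the dominant concern for such problems.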
A number of monographs have been written on IBC. Complexity and Information was published in 1998 in both hardcover and softcover. Its style is informal, and its goal is motivation and insight. It contains a comprehensive bibliography of over 400 recent papers and books. Precise statements and proofs may be found in Information-Based Complexity, Academic Press, 1988 (with G. Wasilkowski and H. Wozniakowski). Earlier monographs are Information, Uncertainty, Complexity, Addison-Wesley, 1983 (with G. Wasilkowski and H. Wozniakowski), and A General Theory of Optimal Algorithms, Academic Press, 1980 (with H. Wozniakowski).
Other monographs include:
My recent expository papers include: