
INFORMATION-BASED COMPLEXITY

J. F. Traub
Department of Computer Science, Columbia University

G. W. Wasilkowski
Department of Computer Science, University of Kentucky

H. Wozniakowski
Department of Computer Science, Columbia University
Institute of Applied Mathematics and Mechanics, University of Warsaw

Academic Press


Preface

The purpose of this book is to provide a comprehensive treatment of information-based complexity. We present both theory and applications. In addition to integrating the work of many researchers, we develop new results.

In two earlier books, we analyzed the worst case setting. Here we study the worst case, average case, probabilistic, randomized, and asymptotic settings. The effect of noisy information is briefly discussed, and some open problems are indicated.

We wish to acknowledge many debts. Our special thanks go to A. G. Werschulz, who authored several sections (as indicated on the front page), suggested many improvements to the whole book, and was of invaluable help in preparing the TeX version of the book and the two indices. We thank T. Boult, who wrote the computer vision section. We are pleased to thank S. Kwapien for useful remarks and guidance concerning measure theory.

We appreciate valuable suggestions from M. A. Kowalski and K. Sikorski, and comments from M. A. Kon, D. Lee, E. W. Packel, and L. Plaskota. We also thank T. Orowan, who superbly typed a number of the chapters in TeX.

We are pleased to thank the National Science Foundation (Grants ICT-85-17289 and DCR-86-03674) and the Advanced Research Projects Agency (Contract N00039-84-C-0165) for supporting the work reported here.

Table of Contents

Preface    xiii

Chapter 1. Overview    1

Chapter 2. Example: Continuous Binary Search    5
1. Introduction    5
1.1. Problem Formulation    6
1.2. Information    6
1.3. Model of Computation    6
2. Complexity    6
3. Worst Case Setting    8
4. Average Case Setting    10
5. Probabilistic Setting    12
6. Relative Error    13
6.1. Worst Case Setting    13
6.2. Average Case Setting    14
6.3. Probabilistic Setting    16
7. Comparison of Complexity    18
8. Mixed Settings    19
9. Noisy Information    20
10. Final Comments    20

Chapter 3. General Formulation    22
1. Introduction    22
2. Formulation    23
2.1. Problem Formulation    23
2.2. Information    25
2.3. Model of Computation    29
2.4. How Can We Compute Approximations?    32
3. Complexity in Worst Case, Average Case, and Probabilistic Settings    33
4. Asymptotic Setting    36
5. Randomization    37

Chapter 4. Worst Case Setting: Theory    39
1. Introduction    39
2. Radius and Diameter of Information    41
3. Algorithms    45
3.1. Local and Global Errors    45
3.2. Central Algorithms    46
3.3. Interpolatory Algorithms    48
4. Cardinality Number and Complexity    50
5. Linear Problems    52
5.1. Definition of Linear Problems    53
5.2. Adaption Versus Nonadaption    54
5.3. Optimal Information    61
5.4. Relations Between nth Minimal Radii and Gelfand n-Widths    66
5.5. Linear Algorithms    71
5.6. Optimal Linear Algorithms and Linear Kolmogorov n-Widths    85
5.7. Spline Algorithms    89
5.8. Complexity    95
6. Different Error Criteria    99
6.1. Relative Error    99
6.2. Normalized Error    102
6.3. Convex and Symmetric Error    106

Chapter 5. Worst Case Setting: Applications    110
1. Introduction    110
2. Integration    110
2.1. Smooth Periodic Functions    112
2.3. Weighted Integration in a Reproducing Kernel Hilbert Space    123
3. Function Approximation    128
3.2. Smooth Nonperiodic Functions: Hilbert Case    134
3.3. Smooth Nonperiodic Functions: Banach Case    137
4. Computer Vision    138
5. Linear Partial Differential Equations    143
6. Integral Equations    157
7. Ill-Posed Problems    163
8. Optimization    166
9. Large Linear Systems    168
10. Eigenvalue Problem    172
11. Ordinary Differential Equations    176
12. Nonlinear Equations    177
13. Topological Degree    182

Chapter 6. Average Case Setting: Theory    184
1. Introduction    184
2. Radius of Information    186
3.1. Local and Global Average Errors    193
4. Average Cardinality Number and Complexity    201
5.1. Linear Problems in the Average Case Setting    205
5.2. Gaussian Measures    206
5.3. Central Algorithms    210
5.4. Spline Algorithms    214
5.5. Optimal Nonadaptive Information    219
5.6.1. Adaptive Information with Fixed Cardinality    223
5.7. Complexity    233
5.8. Linear Problems for Bounded Domains    241
5.8.1. Average Radius of Information    244
6. Different Error Criteria    251
6.1. Relative Error    252
6.2. Normalized Error    261
6.3. General Error Functional    263
6.4. Precision Error    273

Chapter 7. Average Case Setting: Applications    277
1. Introduction    277
2.1. Smooth Functions    278
2.2. Weighted Integration in a Reproducing Kernel Hilbert Space    282
3. Function Approximation    289
3.1. Smooth Periodic Functions    289
3.2. Smooth Nonperiodic Functions    291
4. Ill-Posed Problems    295

Chapter 8. Probabilistic Setting    302
1. Introduction    302
3. Radius of Information    306
5. Linear Problems    308
5.2. Estimates of the Radius of Information    309
5.3. Optimal Information    312
5.5. Linear Problems for Bounded Domains    319
6. Different Error Criteria    321
6.2. Normalized Error    330
6.3. General Error Functional    332
7. Applications    333

Chapter 9. Comparison Between Different Settings    338
1. Introduction    338
2. Integration of Smooth Functions    339
4. Approximation of Smooth Periodic Functions    344
5. Approximation of Smooth Nonperiodic Functions    346

Chapter 10. Asymptotic Setting    349
1. Introduction    349
2. Asymptotic and Worst Case Settings for Linear Problems    356
2.1. Optimal Algorithms    357
2.2. Optimal Information    362
2.3. Continuous Algorithms    367
3. Asymptotic and Worst Case Settings for Nonlinear Problems    368
3.1. Optimal Algorithms    368
3.2. Optimal Information    372
3.3. Applications    373
4.1. Optimal Algorithms    376
4.2. Rate of Convergence    380
4.3. Optimal Information    383

Chapter 11. Randomization    385
1. Introduction    385
2. Random Information and Random Algorithms    386
3. Average Case Setting    390
3.1. Linear Problems for the Whole Space    390
3.2. Linear Problems for Bounded Domains    393
4. Worst Case Setting    394
4.1. Function Approximation and Other Problems    394
4.1.1. Function Approximation    396
4.1.2. Maximum and Extremal Points    397
4.1.3. Function Inverse    398
4.1.4. Topological Degree    398
4.2. Integration    400
4.3. Linear Problems with Unrestricted Information    406

Chapter 12. Noisy Information    409
1. Introduction    409
2. Worst Case Setting with Deterministic Noise    409
2.1. Basic Definitions    409
2.2. Uniformly Bounded Noise    411
3. Average Case Setting with Random Noise    416
3.1. Basic Definitions    416
3.2. Normally Distributed Noise    417
3.3. Does Adaption Help?    418
3.3.1. Examples    418
3.3.2. Adaptive Choice of Observations Does Not Help    420
4. Mixed Setting    424

Appendix    427
1. Functional Analysis    427
1.1. Linear Spaces and Linear Operators    427
1.2. Linear Independence, Dimension, and Linear Subspaces    428
1.3. Norms and Continuous Linear Operators    428
1.4. Banach Spaces    430
1.5. Inner Products and Hilbert Spaces    430
1.5.1. Inner Products    430
1.5.2. Hilbert Spaces    431
1.5.3. Separable Hilbert Spaces    431
1.6. Bounded Operators on Hilbert Spaces    432
1.6.1. Bounded Functionals and Riesz's Theorem    432
1.6.2. Adjoint Operators    432
1.6.3. Orthogonal and Projection Operators    433
1.6.4. Spectrum    433
2. Measure Theory    435
2.1. Borel σ-Field, Measurable Sets, and Functions    435
2.2. Measures and Probability Measures    435
2.3. Integrals    436
2.4. Characteristic Functional    437
2.5. Mean Element    438
2.6. Covariance and Correlation Operators    438
2.7. Induced and Conditional Measures    439
2.8. Product Measures and Fubini's Theorem    439
2.9. Gaussian Measures    440
2.9.1. Measure of a Ball    440
2.9.2. Induced and Conditional Measures    444

Bibliography    448

Author Index    484

Subject Index    490