@MISC{Plaskota_noisyinformation:, author = {Leszek Plaskota}, title = {Noisy Information: Optimality, Complexity, Tractability}, year = {} }


Abstract

In this paper, we present selected old and new results on the optimal solution of linear problems based on noisy information, where the noise is bounded or random. This is done in the framework of information-based complexity (IBC), and the main focus is on the following questions: (i) what is an optimal algorithm for given noisy information? (ii) what is the ε-complexity of a problem with noisy information? (iii) when is a multivariate problem with noisy information tractable? The answers are given for the worst case, average case, and randomized (Monte Carlo) settings. For (ii) and (iii) we present a computational model in which the cost of information depends on the noise level. For instance, for integrating a function f: D → R, the available information may be given as y_j = f(t_j) + x_j, 1 ≤ j ≤ n, with x_j i.i.d. ∼ N(0, σ_j²). For this information one pays ∑_{j=1}^n c(σ_j), where c: [0,∞) → [0,∞] is a given cost function. We will see how the complexity and tractability of linear multivariate problems depend on the cost function, and compare the obtained results with the noiseless case, in which c ≡ 1.
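As an illustration of the information model described above, the following minimal sketch generates noisy observations y_j = f(t_j) + x_j and charges ∑_j c(σ_j) for them. The particular cost function `c` below is a made-up example (cheaper for noisier observations); the paper studies general cost functions, not this specific one.

```python
import math
import random

def noisy_info(f, nodes, sigmas, rng=random.Random(0)):
    """Observe y_j = f(t_j) + x_j with x_j ~ N(0, sigma_j^2)."""
    return [f(t) + rng.gauss(0.0, s) for t, s in zip(nodes, sigmas)]

def info_cost(sigmas, c):
    """Total cost sum_j c(sigma_j) of acquiring the observations."""
    return sum(c(s) for s in sigmas)

# Hypothetical cost function for illustration: exact information
# (sigma = 0) costs 1, and cost decreases as the noise level grows.
c = lambda s: 1.0 / (1.0 + s**2)

nodes = [j / 10 for j in range(1, 11)]   # t_1, ..., t_10 in D = [0, 1]
sigmas = [0.1] * 10                      # uniform noise level
y = noisy_info(math.sin, nodes, sigmas)  # the noisy data y_1, ..., y_n
cost = info_cost(sigmas, c)              # price paid for this data
```

In the noiseless setting (c ≡ 1, all σ_j = 0) the cost of n observations is simply n, which is the usual IBC information cost that the paper compares against.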