Results 1–9 of 9
Uniform observability of hidden Markov models and filter stability for unstable signals
 Department of Operations Research and Financial Engineering, Princeton University, Princeton, NJ 08544 Email address: rvan@princeton.edu
Abstract
Cited by 10 (3 self)
A hidden Markov model is called observable if distinct initial laws give rise to distinct laws of the observation process. Observability implies stability of the nonlinear filter when the signal process is tight, but this need not be the case when the signal process is unstable. This paper introduces a stronger notion of uniform observability which guarantees stability of the nonlinear filter in the absence of stability assumptions on the signal. By developing certain uniform approximation properties of convolution operators, we subsequently demonstrate that the uniform observability condition is satisfied for various classes of filtering models with white noise type observations. This includes the case of observable linear Gaussian filtering models, so that standard results on stability of the Kalman filter are obtained as a special case. 1. Introduction. In a classic paper, Blackwell and Dubins [2] have obtained the following remarkably general result. Let (Yk)k≥0 be a discrete time stochastic process which takes values in a Polish space, and consider the regular conditional probabilities P((Yk)k>m ∈ · | Y0, ..., Ym) and Q((Yk)k>m ∈ · | Y0, ..., Ym).
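The Kalman filter special case mentioned in this abstract can be illustrated numerically. The following is a toy scalar sketch (my own construction, not the paper's): an observable linear Gaussian model with an unstable signal (|a| > 1), where two Kalman filters started from very different initial conditions nevertheless converge to the same estimates.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical observable linear Gaussian model:
#   X_{k+1} = a X_k + w_k,  Y_k = X_k + v_k,  with |a| > 1 (unstable signal).
a, q, r = 1.2, 1.0, 1.0   # signal gain, process noise var, observation noise var

def kalman_step(m, p, y):
    """One predict/update cycle of the scalar Kalman filter."""
    m_pred, p_pred = a * m, a * a * p + q                # predict
    k = p_pred / (p_pred + r)                            # Kalman gain
    return m_pred + k * (y - m_pred), (1 - k) * p_pred   # update

# Simulate the (unstable) signal and its noisy observations.
x, ys = 0.0, []
for _ in range(100):
    x = a * x + rng.standard_normal() * np.sqrt(q)
    ys.append(x + rng.standard_normal() * np.sqrt(r))

# Two filters started from very different initial conditions.
(m1, p1), (m2, p2) = (50.0, 100.0), (-50.0, 0.01)
for y in ys:
    m1, p1 = kalman_step(m1, p1, y)
    m2, p2 = kalman_step(m2, p2, y)

print(f"|m1 - m2| after 100 steps: {abs(m1 - m2):.2e}")
```

Even though the signal itself diverges, the filters forget their initial conditions: the error covariances converge to the same stabilizing Riccati fixed point, and the estimate difference contracts geometrically.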
Discrete time nonlinear filters with informative observations are stable
 Electr. Commun. Probab
Abstract
Cited by 8 (2 self)
Abstract. The nonlinear filter associated with the discrete time signal–observation model (Xk, Yk) is known to forget its initial condition as k → ∞ regardless of the observation structure when the signal possesses sufficiently strong ergodic properties. Conversely, it stands to reason that if the observations are sufficiently informative, then the nonlinear filter should forget its initial condition regardless of any properties of the signal. We show that for observations of additive type Yk = h(Xk) + ξk with invertible observation function h (under mild regularity assumptions on h and on the distribution of the noise ξk), the filter is indeed stable in a weak sense without any assumptions at all on the signal process. If the signal satisfies a uniform continuity assumption, weak stability can be strengthened to stability in total variation. 1.
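The stability phenomenon described in this abstract is easy to observe numerically. Below is a minimal sketch (a toy model of my own choosing, not from the paper): a two-state signal with additive observations Yk = h(Xk) + ξk, where h is trivially invertible on the state set. Two filters started from nearly opposite initial laws are run on the same observation path, and their total variation distance collapses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-state signal; h(x) = x is invertible on {0, 1};
# observation noise xi_k ~ N(0, 0.5^2).
P = np.array([[0.7, 0.3], [0.4, 0.6]])   # signal transition matrix
h = np.array([0.0, 1.0])                 # observation function
sigma = 0.5

def filter_step(pi, y):
    """One Bayes recursion: predict with P, then reweight by the
    Gaussian observation likelihood g(y | x) = N(y; h(x), sigma^2)."""
    pred = pi @ P
    lik = np.exp(-0.5 * ((y - h) / sigma) ** 2)
    post = pred * lik
    return post / post.sum()

# Simulate a signal/observation path.
x, ys = 0, []
for _ in range(200):
    x = rng.choice(2, p=P[x])
    ys.append(h[x] + sigma * rng.standard_normal())

# Run the same filter from two different initial laws.
pi_a, pi_b = np.array([0.99, 0.01]), np.array([0.01, 0.99])
for y in ys:
    pi_a, pi_b = filter_step(pi_a, y), filter_step(pi_b, y)

tv = 0.5 * np.abs(pi_a - pi_b).sum()
print(f"total variation distance after 200 steps: {tv:.2e}")
```

In this toy example the signal is also ergodic, so stability would hold for either reason; the abstract's point is that informative observations alone already suffice.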
INTRINSIC METHODS IN FILTER STABILITY
Abstract
Cited by 6 (2 self)
Abstract. The purpose of this article is to survey some intrinsic methods for studying the stability of the nonlinear filter. By 'intrinsic' we mean methods which directly exploit the fundamental representation of the filter as a conditional expectation through classical probabilistic techniques such as change of measure, martingale convergence, coupling, etc. Beside their conceptual appeal and the additional insight gained into the filter stability problem, these methods allow one to establish stability of the filter under weaker conditions compared to other methods, e.g., to go beyond strong mixing signals, to reveal connections between filter stability and classical notions of observability, and to discover links to martingale convergence and information theory. 1. Introduction. Consider a pair of random sequences (X, Y) = (Xn, Yn)n∈Z+, where the signal component Xn takes values in a Polish space S and the observation component Yn takes values in R^p for some p ≥ 1. The classical filtering problem is to compute the conditional distribution πn(·) = P(Xn ∈ · | F^Y_{0,n}), (1.1) where F^Y_{k,n} stands for the σ-algebra of events generated by Ym, k ≤ m ≤ n (similarly, we will use below the σ-algebra F^X_{k,n} generated by Xm, k ≤ m ≤ n). Once πn is found, the optimal mean square estimate of f(Xn) can be calculated as E(f(Xn) | F^Y_{0,n}) = ∫ f(x) πn(dx) for any function f with E|f(Xn)|² < ∞. If both X and (X, Y) are Markov processes, πn satisfies a recursive filtering equation. Specifically, let Λ and ν denote the transition probability and the initial distribution of X, i.e., for A ∈ B(S): ν(A) = P(X0 ∈ A), Λ(Xn−1, A) = P(Xn ∈ A | F^X_{0,n−1}).
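The recursive filtering equation alluded to at the end of this abstract takes the standard two-step (prediction/update) form. Writing g(x, y) for the observation density — assumed here to exist with respect to a reference measure, an assumption the excerpt itself does not state — the recursion can be sketched as:

```latex
% Prediction: propagate the previous filter through the signal kernel.
(\pi_{n-1}\Lambda)(A) = \int_S \Lambda(x', A)\, \pi_{n-1}(dx'),
  \qquad A \in \mathcal{B}(S),
% Update: reweight by the likelihood of the new observation Y_n.
\pi_n(A) = \frac{\int_A g(x, Y_n)\, (\pi_{n-1}\Lambda)(dx)}
                {\int_S g(x, Y_n)\, (\pi_{n-1}\Lambda)(dx)},
```

with π0 obtained by applying the same update step to the initial law ν. Filter stability asks when two copies of this recursion, started from different choices of ν, merge as n → ∞.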
Forgetting of the initial condition for the filter in general state-space hidden Markov chain: a coupling approach
 Electron. J. Probab
, 2009
Observability and nonlinear filtering
Abstract
Cited by 2 (1 self)
Abstract. This paper develops a connection between the asymptotic stability of nonlinear filters and a notion of observability. We consider a general class of hidden Markov models in continuous time with compact signal state space, and call such a model observable if no two initial measures of the signal process give rise to the same law of the observation process. We demonstrate that observability implies stability of the filter, i.e., the filtered estimates become insensitive to the initial measure at large times. For the special case where the signal is a finite-state Markov process and the observations are of the white noise type, a complete (necessary and sufficient) characterization of filter stability is obtained in terms of a slightly weaker detectability condition. In addition to observability, the role of controllability in filter stability is explored. Finally, the results are partially extended to noncompact signal state spaces.
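Why observability matters can be seen in a minimal counterexample (a toy discrete-time construction of my own, not the paper's continuous-time setting): if two initial laws induce the same law of the observations, the filter has no mechanism to forget its initial condition unless the signal itself is ergodic.

```python
import numpy as np

# Unobservable toy model: the observation function is constant, so every
# initial law produces the same observation law; the signal is non-ergodic.
P = np.eye(2)                 # the chain never moves
h = np.array([1.0, 1.0])      # constant h: Y_k = h(X_k) + xi_k carries no
sigma = 1.0                   # information about the state

def filter_step(pi, y):
    pred = pi @ P
    lik = np.exp(-0.5 * ((y - h) / sigma) ** 2)
    post = pred * lik
    return post / post.sum()

rng = np.random.default_rng(2)
ys = 1.0 + sigma * rng.standard_normal(500)   # observations from state 0

pi_a, pi_b = np.array([0.9, 0.1]), np.array([0.1, 0.9])
for y in ys:
    pi_a, pi_b = filter_step(pi_a, y), filter_step(pi_b, y)

# The likelihood is identical for both states, so each filter simply
# retains its prior: the total variation gap never shrinks (stays ~0.8).
tv = 0.5 * np.abs(pi_a - pi_b).sum()
print(tv)
```

Observability (or the weaker detectability condition the abstract mentions) rules out exactly this kind of degeneracy.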
UNIFORM OBSERVABILITY OF HIDDEN MARKOV MODELS AND FILTER STABILITY FOR UNSTABLE SIGNALS By Ramon van Handel
, 804
Abstract
A hidden Markov model is called observable if distinct initial laws give rise to distinct laws of the observation process. Observability implies stability of the nonlinear filter when the signal process is tight, but this need not be the case when the signal process is unstable. This paper introduces a stronger notion of uniform observability which guarantees stability of the nonlinear filter in the absence of stability assumptions on the signal. By developing certain uniform approximation properties of convolution operators, we subsequently demonstrate that the uniform observability condition is satisfied for various classes of filtering models with white noise type observations. This includes the case of observable linear Gaussian filtering models, so that standard results on stability of the Kalman–Bucy filter are obtained as a special case. 1. Introduction. In a classic paper, Blackwell and Dubins [2] have obtained the following remarkably general result. Let (Yk)k≥0 be a discrete time ...
Effects of statistical dependence on multiple testing under a hidden Markov model (running title: Likelihood ratio under HMM)
, 2009
Abstract
The performance of multiple hypothesis testing is known to be affected by the statistical dependence among the random variables involved. The mechanisms responsible for this, however, are not well understood. We study the effects of the dependence structure of a finite state hidden Markov model (HMM) on the likelihood ratios critical for optimal multiple testing on the hidden states. Various convergence results are obtained for the likelihood ratios as the observations of the HMM form an increasingly long chain. Analytic expansions of the first and second order derivatives are obtained for the case of binary states, explicitly showing the effects of the parameters of the HMM on the likelihood ratios. Key words and phrases. HMM; multiple hypothesis testing; FDR; contraction; nonlinear filtering. AMS 2000 subject classification: 62H15; 62M02.