Results 1–10 of 26
Exponential Stability for Nonlinear Filtering
, 1996
"... We study the a.s. exponential stability of the optimal filter w.r.t. its initial conditions. A bound is provided on the exponential rate (equivalently, on the memory length of the filter) for a general setting both in discrete and in continuous time, in terms of Birkhoff's contraction coefficie ..."
Abstract

Cited by 59 (2 self)
We study the a.s. exponential stability of the optimal filter w.r.t. its initial conditions. A bound is provided on the exponential rate (equivalently, on the memory length of the filter) for a general setting both in discrete and in continuous time, in terms of Birkhoff's contraction coefficient. Criteria for exponential stability and explicit bounds on the rate are given in the specific cases of a diffusion process on a compact manifold, and discrete-time Markov chains on both continuous and discrete (countable) state spaces.
Lyapunov Exponents for Finite State Nonlinear Filtering
 SIAM JOURNAL ON CONTROL AND OPTIMIZATION
, 1996
"... Consider the Wonham optimal filtering problem for a finite state ergodic Markov process in both discrete and continuous time, and let oe be the noise intensity for the observation. We examine the sensitivity of the solution with respect to the filter's initial conditions in terms of the gap bet ..."
Abstract

Cited by 35 (3 self)
Consider the Wonham optimal filtering problem for a finite state ergodic Markov process in both discrete and continuous time, and let σ be the noise intensity for the observation. We examine the sensitivity of the solution with respect to the filter's initial conditions in terms of the gap between the first two Lyapunov exponents of the Zakai equation for the unnormalized conditional probability. This gap is studied in the limit as σ → 0 by techniques involving considerations of nonlinear filtering and the stochastic Feynman-Kac formula. Conditions are given for the limit to be either negative or −∞. Asymptotic bounds are derived in the latter case.
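The filter stability studied in the two entries above can be illustrated numerically: run the same discrete-time filter recursion from two different initial conditions and watch their distance decay exponentially. The sketch below uses a 2-state chain observed in Gaussian noise; all parameter values are illustrative choices, not taken from either paper.

```python
import numpy as np

# Illustrative sketch (hypothetical parameters): a discrete-time filter for a
# 2-state ergodic Markov chain observed in additive Gaussian noise, started
# from two different initial conditions. Filter stability means the L1 gap
# between the two conditional distributions decays exponentially.
rng = np.random.default_rng(0)
P = np.array([[0.9, 0.1], [0.2, 0.8]])   # transition matrix (illustrative)
h = np.array([0.0, 1.0])                 # observation levels per state
sigma = 0.5                              # observation noise intensity

def filter_step(pi, y):
    """One update: predict with P, then correct with the Gaussian likelihood."""
    pred = pi @ P
    lik = np.exp(-(y - h) ** 2 / (2 * sigma ** 2))
    post = pred * lik
    return post / post.sum()

x = 0
pi_a = np.array([0.5, 0.5])    # one initial condition
pi_b = np.array([0.99, 0.01])  # a badly mis-specified initial condition
dists = []
for _ in range(200):
    x = rng.choice(2, p=P[x])                 # signal transition
    y = h[x] + sigma * rng.standard_normal()  # noisy observation
    pi_a, pi_b = filter_step(pi_a, y), filter_step(pi_b, y)
    dists.append(np.abs(pi_a - pi_b).sum())

print(dists[0], dists[-1])  # the gap shrinks as the filter forgets its start
```

The per-step contraction seen here is exactly what Birkhoff's contraction coefficient bounds in the first entry.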
Analyticity of entropy rate of a hidden Markov chain
 In Proc. of IEEE International Symposium on Information Theory, Adelaide, Australia, September 4–9, 2005
, 2005
"... We prove that under mild positivity assumptions the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. A general principle to determine the domain of analyticity is stated. An example is given to estimate the radius of convergence for t ..."
Abstract

Cited by 31 (13 self)
 Add to MetaCart
We prove that under mild positivity assumptions the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. A general principle to determine the domain of analyticity is stated. An example is given to estimate the radius of convergence for the entropy rate. We then show that the positivity assumptions can be relaxed, and examples are given for the relaxed conditions. We study a special class of hidden Markov chains in more detail: binary hidden Markov chains with an unambiguous symbol, and we give necessary and sufficient conditions for analyticity of the entropy rate for this case. Finally, we show that under the positivity assumptions the hidden Markov chain itself varies analytically, in a strong sense, as a function of the underlying Markov chain parameters.
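The entropy rate that this entry studies has no closed form in general, but it can be estimated by Monte Carlo through the forward (filter) recursion, since H = −lim (1/n) log p(Y₁…Yₙ). A minimal sketch, with an illustrative binary chain passed through a BSC (all parameter values are assumptions, not from the paper):

```python
import numpy as np

# Illustrative sketch: estimate the entropy rate (bits/symbol) of a binary
# hidden Markov chain -- a binary Markov chain observed through a BSC with
# crossover probability eps -- via the forward recursion. Parameters are
# hypothetical.
rng = np.random.default_rng(1)
P = np.array([[0.7, 0.3], [0.4, 0.6]])          # underlying chain (illustrative)
eps = 0.1                                        # BSC crossover probability
E = np.array([[1 - eps, eps], [eps, 1 - eps]])   # E[x, y] = P(Y=y | X=x)

# stationary distribution: left eigenvector of P for eigenvalue 1
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = pi / pi.sum()

n = 50_000
x = rng.choice(2, p=pi)
pred = pi.copy()          # predictive distribution of the next hidden state
loglik = 0.0
for _ in range(n):
    x = rng.choice(2, p=P[x])                    # hidden transition
    y = x if rng.random() > eps else 1 - x       # BSC observation
    joint = pred * E[:, y]
    py = joint.sum()                             # p(y_k | y_{<k})
    loglik += np.log2(py)
    pred = (joint / py) @ P                      # filter, then predict

rate = -loglik / n
print(rate)   # Monte Carlo estimate of the entropy rate in bits
```

Analyticity of the entropy rate in (P, eps), as proved in the entry above, is what justifies perturbative and series approaches to this quantity.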
Unsolved problems concerning random walks on trees, Classical and modern branching processes
 IMA Vol. Math. Appl
, 1994
"... ..."
(Show Context)
Asymptotics of the input-constrained binary symmetric channel capacity
 Annals of Applied Probability
, 2009
"... We study the classical problem of noisy constrained capacity in the case of the binary symmetric channel (BSC), namely, the capacity of a BSC whose inputs are sequences chosen from a constrained set. Motivated by a result of Ordentlich and Weissman [In Proceedings of IEEE Information Theory Workshop ..."
Abstract

Cited by 11 (7 self)
We study the classical problem of noisy constrained capacity in the case of the binary symmetric channel (BSC), namely, the capacity of a BSC whose inputs are sequences chosen from a constrained set. Motivated by a result of Ordentlich and Weissman [In Proceedings of IEEE Information Theory Workshop (2004) 117–122], we derive an asymptotic formula (when the noise parameter is small) for the entropy rate of a hidden Markov chain, observed when a Markov chain passes through a BSC. Using this result, we establish an asymptotic formula for the capacity of a BSC with input process supported on an irreducible finite-type constraint, as the noise parameter tends to zero.
Analytic Expansions of (max,+) Lyapunov Exponents
, 1998
"... We give an explicit analytic series expansion of the (max; +)Lyapunov exponent fl(p) of a sequence of independent and identically distributed random matrices in this algebra, generated via a Bernoulli scheme depending on a small parameter p. A key assumption is that one of the matrices has a unique ..."
Abstract

Cited by 10 (1 self)
We give an explicit analytic series expansion of the (max,+) Lyapunov exponent γ(p) of a sequence of independent and identically distributed random matrices in this algebra, generated via a Bernoulli scheme depending on a small parameter p. A key assumption is that one of the matrices has a unique eigenvector. This allows us to use a representation of this exponent as the mean value of a certain random variable, and then a discrete analogue of the so-called light-traffic perturbation formulas to derive the expansion. We show that it is analytic under a simple condition on p. This also provides a closed form expression for all derivatives of γ(p) at p = 0 and approximations of γ(p) of any order, together with an error estimate for finite order Taylor approximations. Several extensions of this are discussed, including expansions of multinomial schemes depending on small parameters (p1, ..., pm) and expansions for exponents associated with iterates of a class of random operators...
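The exponent γ(p) expanded in the entry above can also be estimated by direct simulation: iterate the (max,+) matrix-vector product and measure the linear growth rate of the iterate. A minimal sketch with two hypothetical 2×2 matrices (the matrices, p, and the mixing scheme are illustrative assumptions, not the paper's example):

```python
import numpy as np

# Illustrative sketch: Monte Carlo estimate of the (max,+) Lyapunov exponent
# gamma(p) of an i.i.d. Bernoulli(p) mixture of two fixed matrices A and B.
# In the (max,+) semiring, (M (x) x)_i = max_j (M_ij + x_j).
rng = np.random.default_rng(2)
A = np.array([[1.0, 3.0], [0.0, 2.0]])   # hypothetical matrices
B = np.array([[2.0, 0.0], [4.0, 1.0]])

def gamma_estimate(p, n=20_000):
    """Estimate gamma(p) = lim (1/n) max_i x_i for x_{k+1} = M_k (x) x_k."""
    x = np.zeros(2)
    for _ in range(n):
        M = A if rng.random() < p else B
        # (max,+) matrix-vector product
        x = (M + x[None, :]).max(axis=1)
    return x.max() / n

g = gamma_estimate(0.5)
print(g)
```

The series expansion in the paper gives γ(p) analytically near p = 0; a simulation like this is one way to check low-order Taylor approximations numerically.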
Derivatives of entropy rate in special families of hidden Markov chains
 IEEE TRANS. INFO. THEORY
, 2007
"... Consider a hidden Markov chain obtained as the observation process of an ordinary Markov chain corrupted by noise. Recently Zuk et al. showed how, in principle, one can explicitly compute the derivatives of the entropy rate of at extreme values of the noise. Namely, they showed that the derivatives ..."
Abstract

Cited by 9 (4 self)
Consider a hidden Markov chain obtained as the observation process of an ordinary Markov chain corrupted by noise. Recently Zuk et al. showed how, in principle, one can explicitly compute the derivatives of the entropy rate at extreme values of the noise. Namely, they showed that the derivatives of standard upper approximations to the entropy rate actually stabilize at an explicit finite time. We generalize this result to a natural class of hidden Markov chains called "Black Holes." We also discuss in depth special cases of binary Markov chains observed in binary symmetric noise, and give an abstract formula for the first derivative in terms of a measure on the simplex due to Blackwell.
One-dimensional finite range random walk in random medium and invariant measure equation
, 2007
Analyticity of Entropy Rate in Families of Hidden Markov Chains
, 2008
"... We prove that under a mild positivity assumption the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. We give examples to show how this can fail in some cases. And we study two natural special classes of hidden Markov chains in more d ..."
Abstract

Cited by 6 (1 self)
We prove that under a mild positivity assumption the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. We give examples to show how this can fail in some cases. We also study two natural special classes of hidden Markov chains in more detail: binary hidden Markov chains with an unambiguous symbol and binary Markov chains corrupted by binary symmetric noise. Finally, we show that under the positivity assumption the hidden Markov chain itself varies analytically, in a strong sense, as a function of the underlying Markov chain parameters.
Series Expansions of Lyapunov Exponents and Forgetful Monoids
, 2000
"... We consider Lyapunov exponents of random iterates of monotone homogeneous maps. We assume that the images of some iterates are lines, with positive probability. Using this memoryloss property which holds generically for random products of matrices over the maxplus semiring, and in particular, for ..."
Abstract

Cited by 5 (0 self)
We consider Lyapunov exponents of random iterates of monotone homogeneous maps. We assume that the images of some iterates are lines, with positive probability. Using this memory-loss property, which holds generically for random products of matrices over the max-plus semiring, and in particular, for Tetris-like heaps of pieces models, we give a series expansion formula for the Lyapunov exponent, as a function of the probability law. In the case of rational probability laws, we show that the Lyapunov exponent is an analytic function of the parameters of the law, in a domain that contains the absolute convergence domain of a partition function associated to a special "forgetful" monoid, defined by generators and relations.