
## THE ENTROPY RATE OF A BINARY CHANNEL WITH SLOWLY SWITCHING INPUT (2006)

Citations: 1 (0 self)

### Citations

12352 |
Elements of Information Theory
- Cover, Thomas
- 1991
Citation Context: …rates of X^ε and Y^ε respectively. Then H(Y^{εn}) = H(X^{εn}) + H(Y^{εn}|X^{εn}) − H(X^{εn}|Y^{εn}) ≤ H(X^{εn}) + H(Y^{εn}|X^{εn}), where H(X^{εn}|Y^{εn}) denotes the conditional entropy of X^{εn} given Y^{εn} (see e.g. [5]). Since Y^ε is an i.i.d. sequence conditioned on X^ε, the formula (1.1) implies H(Y^{εn}|X^{εn}) = nh(p), while H(X^ε) = lim_{n→∞} (1/n) H(X^{εn}) = −ελ log(ελ) − (1−ελ) log(1−ελ) = λε log ε⁻¹ (1 + o(1)). Combi…
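The small-ε asymptotics quoted in this excerpt, h(ελ) = −ελ log(ελ) − (1−ελ) log(1−ελ) = λε log ε⁻¹ (1 + o(1)), is easy to check numerically. A minimal sketch (natural logarithms; the values of ε and λ below are illustrative, not from the paper):

```python
import math

def h(x):
    """Binary entropy in nats, with the convention 0*log(0) = 0."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log(x) - (1 - x) * math.log(1 - x)

lam = 1.0
for eps in (1e-3, 1e-6, 1e-9):
    exact = h(eps * lam)                     # entropy rate of the slowly switching chain
    leading = lam * eps * math.log(1 / eps)  # leading term lambda * eps * log(1/eps)
    print(eps, exact / leading)              # ratio approaches 1 as eps -> 0
```

The correction is of relative order 1/log(1/ε), so the convergence of the ratio is slow, consistent with the (1 + o(1)) factor in the excerpt.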

59 |
The entropy of functions of finite-state Markov chains. Trans. of the First Prague Conference on Information Theory, Statistical Decision Functions, Random Processes
- Blackwell
- 1957
Citation Context: …The sequence Y is not Markov in general and, in spite of its seemingly simple structure, not much is known about H(Y). The problem of calculating the entropy rate of Y was first addressed in [2] in the special case Yₙ = g(Xₙ) for some deterministic function g : S → S. In this short paper D. Blackwell derived a formula for H(Y) (see (2.3) below), whose main ingredient is the probability meas…

36 |
A limit theorem for partially observed Markov chains.
- Kaijser
- 1975
Citation Context: …(2.2) would imply

H(Y) = −∫_{S^{d−1}} ∑_{i=1}^d ( ∑_{j=1}^d p_{ji} ∑_{ℓ=1}^d λ_{ℓj} u_ℓ ) log( ∑_{j=1}^d p_{ji} ∑_{ℓ=1}^d λ_{ℓj} u_ℓ ) M(du). (2.3)

However in general π may have multiple invariant measures, as noted by T. Kaijser in [6] (see also a discussion of his counterexample in the context of nonlinear filtering [3]). Remarkably, Blackwell's formula (2.3) remains valid in this case as well, which follows from the result of Birc…

35 | Lyapunov exponents for finite state nonlinear filtering
- Atar, Zeitouni
- 1997
Citation Context: …3) to the same unique value! 3. Proof of (1.3). First we verify the following geometric inequality. Introduce

H(p, q) := −((1−p)q + p(1−q)) log((1−p)q + p(1−q)) − ((1−p)(1−q) + pq) log((1−p)(1−q) + pq), p, q ∈ [0, 1].

Lemma 3.1. For any q, p ∈ [0, 1],

H(p, q) ≥ h(p) + 4(log 2 − h(p)) q(1−q). (3.1)

Proof. Throughout we fix some p in [0, 1] and consider H(p, q) as a function…
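The inequality quoted in this excerpt, H(p, q) ≥ h(p) + 4(log 2 − h(p)) q(1−q), where H(p, q) is the binary entropy of the composed crossover probability (1−p)q + p(1−q), can be sanity-checked on a grid. A minimal numerical sketch, assuming natural logarithms (so h(1/2) = log 2); this is a verification of the stated bound, not code from the paper:

```python
import math

def h(x):
    """Binary entropy in nats; 0*log(0) = 0 by convention."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log(x) - (1 - x) * math.log(1 - x)

def H(p, q):
    """Entropy of the composed crossover (1-p)*q + p*(1-q), as in the lemma."""
    return h((1 - p) * q + p * (1 - q))

log2 = math.log(2)
N = 200
for i in range(N + 1):
    for j in range(N + 1):
        p, q = i / N, j / N
        lhs = H(p, q)
        rhs = h(p) + 4 * (log2 - h(p)) * q * (1 - q)
        # small slack for floating-point roundoff near the equality cases
        assert lhs >= rhs - 1e-12, (p, q)
print("inequality (3.1) holds on the grid")
```

Note the equality cases visible in the check: q ∈ {0, 1/2, 1} for any p, and p = 1/2 for any q, matching the structure of the bound.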

32 | Asymptotic stability of the Wonham filter: ergodic and nonergodic signals
- Baxendale, Chigansky, et al.
Citation Context: …(2.3). However in general π may have multiple invariant measures, as noted by T. Kaijser in [6] (see also a discussion of his counterexample in the context of nonlinear filtering [3]). Remarkably, Blackwell's formula (2.3) remains valid in this case as well, which follows from the result of Birch [4], based on the convergence of a certain associated martingale. In other words, any invar…

24 | Approximations for the entropy for functions of Markov chains.
- Birch
- 1962
Citation Context: …find explicitly and, as indicated by several examples in [2], it may have quite a complicated structure. Consequently, much research in this direction has focused on finding good estimates for H(Y) (see e.g. [4]). On the positive side, the measure M tends to concentrate near the vertices of the simplex S^{d−1} when certain problem parameters are taken to their limits, reflecting the fact that the chain X can be esti…

20 | Asymptotic filtering for finite state Markov chains, Stochastic Process
- Khasminskii, Zeitouni
- 1996
Citation Context: …≥ h(p) + 2(log 2 − h(p)) min(q, 1−q), q, p ∈ [0, 1], (3.5) which follows from the concavity of x ↦ h(x) (see (3.2)). Now one can use the asymptotics of the maximum a posteriori error derived in [10]:

P(X^ε_0 ≠ argmaxᵢ π^ε_0(i)) = ( ∑_{i=1}^d ∑_{j≠i} λ_{ij} μ_i / D_{ij} ) ε log ε⁻¹ (1 + o(1)), ε → 0.

The inequality (3.5) and the identity P(X^ε_0 ≠ argmaxᵢ π^ε_0(i)) = 1 − E max(π^ε_0, 1 − π^ε_0)…
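The identity quoted at the end of this excerpt, P(X ≠ argmax π) = 1 − E max(π, 1 − π), is just the MAP error probability rewritten through the posterior, and it can be verified by exact enumeration in a toy binary model. The prior θ and crossover probability p below are illustrative choices, not values from the paper:

```python
from itertools import product

theta, p = 0.3, 0.2  # illustrative prior P(X=1) and channel crossover probability

def joint(x, y):
    """Joint law of (X, Y): Bernoulli(theta) input through a binary symmetric channel."""
    prior = theta if x == 1 else 1 - theta
    flip = p if x != y else 1 - p
    return prior * flip

def posterior(y):
    """pi(y) = P(X = 1 | Y = y)."""
    return joint(1, y) / (joint(0, y) + joint(1, y))

# Direct MAP error: mass of all (x, y) where the MAP guess misses x.
map_error = sum(
    joint(x, y)
    for x, y in product((0, 1), repeat=2)
    if (1 if posterior(y) > 0.5 else 0) != x
)

# Same quantity via the posterior identity 1 - E max(pi, 1 - pi).
p_y = {y: joint(0, y) + joint(1, y) for y in (0, 1)}
identity = 1 - sum(p_y[y] * max(posterior(y), 1 - posterior(y)) for y in (0, 1))

print(map_error, identity)  # the two values coincide
```

The identity holds by definition of the MAP estimator: conditioned on Y, the best guess is right with probability max(π, 1 − π), and averaging over Y gives the complement of the error.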

18 |
On the Entropy of a
- Jacquet, Seroussi, et al.
- 2004
Citation Context: …reflecting the fact that the chain X can be estimated exactly in the corresponding situations. This has a potential for explicit asymptotic estimates of H(Y). Recently some progress has been reported ([7], [8], [11], [12]) for the two-dimensional case S = {0, 1}, which is usually referred to as the binary channel. In particular, the slowly switching signal limit was considered in [11]. Let X^ε = (X^ε_n)…

13 | From Finite-System Entropy to Entropy Rate for a
- Zuk, Domany, et al.
- 2006
Citation Context: …that the chain X can be estimated exactly in the corresponding situations. This has a potential for explicit asymptotic estimates of H(Y). Recently some progress has been reported ([7], [8], [11], [12]) for the two-dimensional case S = {0, 1}, which is usually referred to as the binary channel. In particular, the slowly switching signal limit was considered in [11]. Let X^ε = (X^ε_n)_{n≥1} be the Markov…

9 |
Asymptotic filtering and entropy rate of a hidden Markov process in the rare transitions regime
- Nair, Ordentlich, et al.
Citation Context: …the fact that the chain X can be estimated exactly in the corresponding situations. This has a potential for explicit asymptotic estimates of H(Y). Recently some progress has been reported ([7], [8], [11], [12]) for the two-dimensional case S = {0, 1}, which is usually referred to as the binary channel. In particular, the slowly switching signal limit was considered in [11]. Let X^ε = (X^ε_n)_{n≥1} be the…

8 |
On filtering for a hidden Markov chain under square performance criterion, Probl
- Golubev
- 2000
Citation Context: …essions gives the upper bound in (1.2). A more serious effort is required to get the lower bound. The purpose of this note is to give an elementary proof of the following bound, using the result from [9].

[Figure: "Comparison of the asymptotic lower bounds for H(Y)" — the ε log(1/ε) coefficients 4(log 2 − h_p)/D_p and (1−2p)²/(1−p), plotted against the channel parameter p ∈ [0, 0.5].]

6 |
Analyticity of Entropy Rate
- Han, Marcus
- 2006
Citation Context: …reflecting the fact that the chain X can be estimated exactly in the corresponding situations. This has a potential for explicit asymptotic estimates of H(Y). Recently some progress has been reported ([7], [8], [11], [12]) for the two-dimensional case S = {0, 1}, which is usually referred to as the binary channel. In particular, the slowly switching signal limit was considered in [11]. Let X^ε = (X^ε_n)_{n≥1} b…

1 |
unpublished
- Zuk
Citation Context: …) is tighter than (1.2) for higher p. It is worth mentioning that certain heuristic arguments indicate in favor of the exact asymptotics H(Y) = h(p) + ε log ε⁻¹ (1 + o(1)), as conjectured by O. Zuk [13]. Remark 1.4. The case p = 1/2, when Y^ε is an i.i.d. symmetric binary chain and hence H(Y^ε) ≡ log 2 independently of ε, is outside the scope of Proposition 1.1. This suggests that the coefficient of…