Kontoyiannis, I., Harremoës, P. and Johnson, O. (2005) *Entropy and the law of small numbers.* IEEE Transactions on Information Theory, 51, pp. 466–472. ISSN 0018-9448

## Abstract

Two new information-theoretic methods are introduced for establishing Poisson approximation inequalities. First, using only elementary information-theoretic techniques it is shown that, when $S_n = \sum_{i=1}^n X_i$ is the sum of the (possibly dependent) binary random variables $X_1, X_2, \ldots, X_n$, with $E(X_i) = p_i$ and $E(S_n) = \lambda$, then

$$D(P_{S_n} \| \mathrm{Po}(\lambda)) \le \sum_{i=1}^n p_i^2 + \left[\, \sum_{i=1}^n H(X_i) - H(X_1, X_2, \ldots, X_n) \right],$$

where $D(P_{S_n} \| \mathrm{Po}(\lambda))$ is the relative entropy between the distribution of $S_n$ and the Poisson($\lambda$) distribution. The first term in this bound measures the individual smallness of the $X_i$ and the second term measures their dependence. A general method is outlined for obtaining corresponding bounds when approximating the distribution of a sum of general discrete random variables by an infinitely divisible distribution. Second, in the particular case when the $X_i$ are independent, the following sharper bound is established:

$$D(P_{S_n} \| \mathrm{Po}(\lambda)) \le \frac{1}{\lambda} \sum_{i=1}^n \frac{p_i^3}{1 - p_i},$$

and it is also generalized to the case when the $X_i$ are general integer-valued random variables. Its proof is based on the derivation of a subadditivity property for a new discrete version of the Fisher information, and uses a recent logarithmic Sobolev inequality for the Poisson distribution. © 2005 IEEE.
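The two bounds in the abstract can be checked numerically in the independent Bernoulli case, where the dependence term $\sum_i H(X_i) - H(X_1,\ldots,X_n)$ vanishes. The sketch below (an illustration, not from the paper; the probabilities `ps` are arbitrarily chosen) builds the exact distribution of $S_n$ by convolution, computes the relative entropy to Poisson($\lambda$), and compares it with both bounds:

```python
import math

def poisson_binomial_pmf(ps):
    # Exact pmf of S_n = X_1 + ... + X_n for independent Bernoulli(p_i),
    # built by convolving one Bernoulli at a time.
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += q * (1 - p)      # X_i = 0
            new[k + 1] += q * p        # X_i = 1
        pmf = new
    return pmf

def poisson_pmf(lam, k):
    return math.exp(-lam) * lam**k / math.factorial(k)

def kl_to_poisson(pmf, lam):
    # D(P_{S_n} || Po(lambda)); terms with P(S_n = k) = 0 contribute 0,
    # so summing over the support of S_n suffices.
    return sum(q * math.log(q / poisson_pmf(lam, k))
               for k, q in enumerate(pmf) if q > 0)

ps = [0.05, 0.10, 0.02, 0.08]          # example probabilities (assumed)
lam = sum(ps)
d = kl_to_poisson(poisson_binomial_pmf(ps), lam)

bound1 = sum(p**2 for p in ps)                     # first bound (dependence term = 0 here)
bound2 = sum(p**3 / (1 - p) for p in ps) / lam     # sharper bound for independent X_i

print(d, bound1, bound2)
assert 0 <= d <= bound2 <= bound1
```

As the abstract states, the second bound is sharper in this regime: here it is roughly $\lambda^{-1}\sum_i p_i^3$, an order of $p_i$ smaller than $\sum_i p_i^2$.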

| Item Type: | Article |
|---|---|
| Subjects: | UNSPECIFIED |
| Divisions: | Div F > Signal Processing and Communications |
| Depositing User: | Cron Job |
| Date Deposited: | 08 Jan 2018 20:11 |
| Last Modified: | 27 Oct 2020 07:12 |
| DOI: | 10.1109/TIT.2004.840861 |