Kontoyiannis, I. and Madiman, M. (2013) *The entropy of sums and Ruzsa's divergence on abelian groups.* In: UNSPECIFIED.

## Abstract

Motivated by a series of recently discovered inequalities for the sum and difference of discrete or continuous random variables [3], [5], [9], [10], we argue that the most natural, general form of these results is in terms of a special case of mutual information, which we call the Ruzsa divergence between two probability distributions. This can be defined for arbitrary pairs of random variables taking values in any discrete (countable) set, on Rn, or in fact on any locally compact Hausdorff abelian group. We study the basic properties of the Ruzsa divergence and derive numerous consequences. In particular, we show that many of the inequalities in [3], [5], [9], [10] can be stated and proved in a unified way, extending their validity to the present general setting. For example, consequences of the basic properties of the Ruzsa divergence developed here include the fact that the entropies of the sum and the difference of two independent random vectors severely constrain each other, as well as entropy analogues of a number of results in additive combinatorics. Although the setting is quite general, the results are already of interest (and new) in the case of random vectors in Rn. For instance, another consequence in Rn is an entropic analogue (in the setting of log-concave distributions) of the Rogers-Shephard inequality for convex bodies. © 2013 IEEE.
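The claim that the entropies of the sum and the difference constrain each other can be checked numerically in the simplest setting: for i.i.d. discrete X, X', the known entropy analogues of Ruzsa's sumset inequalities give H(X - X') - H(X) ≤ 2[H(X + X') - H(X)] and symmetrically with sum and difference swapped. The sketch below verifies both bounds for one hypothetical distribution (the distribution `p` is an arbitrary illustrative choice, not taken from the paper):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a dict mapping value -> probability."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def conv(p, sign):
    """Distribution of X + sign*X' for independent X, X' ~ p (integer support)."""
    out = {}
    for x, px in p.items():
        for y, py in p.items():
            out[x + sign * y] = out.get(x + sign * y, 0.0) + px * py
    return out

# Hypothetical skewed distribution on {0, 1, 2, 3}
p = {0: 0.5, 1: 0.25, 2: 0.15, 3: 0.1}

hX = entropy(p)
h_sum = entropy(conv(p, +1))   # H(X + X')
h_diff = entropy(conv(p, -1))  # H(X - X')

# Entropy analogues of Ruzsa's sumset inequalities for i.i.d. X, X':
#   H(X - X') - H(X) <= 2 [H(X + X') - H(X)]
#   H(X + X') - H(X) <= 2 [H(X - X') - H(X)]
assert h_diff - hX <= 2 * (h_sum - hX) + 1e-12
assert h_sum - hX <= 2 * (h_diff - hX) + 1e-12
print(hX, h_sum, h_diff)
```

Note how the two inequalities pin each of H(X + X') and H(X - X') inside a window determined by the other, which is the one-dimensional discrete shadow of the general abelian-group results summarized above.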

| Item Type: | Conference or Workshop Item (UNSPECIFIED) |
|---|---|
| Subjects: | UNSPECIFIED |
| Divisions: | Div F > Signal Processing and Communications |
| Depositing User: | Cron Job |
| Date Deposited: | 08 Jan 2018 20:12 |
| Last Modified: | 27 Oct 2020 07:12 |
| DOI: | 10.1109/ITW.2013.6691279 |