Kontoyiannis, I and Madiman, M (2012) *Sumset inequalities for differential entropy and mutual information.* In: 2012 IEEE International Symposium on Information Theory (ISIT), pp. 1261-1265.

## Abstract

The Plünnecke–Ruzsa sumset theory gives bounds connecting the cardinality of the sumset A + B = {a + b : a ∈ A, b ∈ B} with the cardinalities of the original sets A, B. For example, the sum-difference bound states that |A+B| |A| |B| ≤ |A−B|³, where A−B = {a − b : a ∈ A, b ∈ B}. Interpreting the differential entropy h(X) as (the logarithm of) the size of the effective support of X, the main results here are a series of natural information-theoretic analogs of these bounds. For example, the sum-difference bound becomes the new inequality h(X + Y) + h(X) + h(Y) ≤ 3h(X − Y) for independent X, Y. Our results include differential-entropy versions of Ruzsa's triangle inequality, the Plünnecke–Ruzsa inequality, and the Balog–Szemerédi–Gowers lemma. Versions of most of these results for the discrete entropy H(X) were recently proved by Tao, relying heavily on a strong, functional form of the submodularity property of H(X). Since differential entropy is not functionally submodular, many of the corresponding discrete proofs fail in the continuous case, in several instances requiring substantially new proof strategies. The basic property that naturally replaces functional submodularity is the data processing property of mutual information. © 2012 IEEE.
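As a quick sanity check of the entropy sum-difference inequality h(X + Y) + h(X) + h(Y) ≤ 3h(X − Y), one can evaluate both sides for independent Gaussians, where differential entropy has the closed form h(N(0, σ²)) = ½ ln(2πeσ²). This numerical sketch is illustrative only (the variances and helper name are not from the paper); note that for independent X, Y both X + Y and X − Y have variance σ²_X + σ²_Y.

```python
import math

def gaussian_h(var):
    # Differential entropy (in nats) of N(0, var): 0.5 * ln(2*pi*e*var)
    return 0.5 * math.log(2 * math.pi * math.e * var)

# Independent X ~ N(0, vx), Y ~ N(0, vy); then X+Y and X-Y are both N(0, vx+vy)
vx, vy = 1.0, 4.0
hX, hY = gaussian_h(vx), gaussian_h(vy)
h_sum = gaussian_h(vx + vy)   # h(X + Y)
h_diff = gaussian_h(vx + vy)  # h(X - Y), equal to h(X + Y) in the Gaussian case

lhs = h_sum + hX + hY
rhs = 3 * h_diff
print(lhs <= rhs)
```

In the Gaussian case the inequality reduces to h(X) + h(Y) ≤ 2h(X + Y), which holds because adding an independent variable can only increase differential entropy.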

| Item Type: | Conference or Workshop Item (UNSPECIFIED) |
|---|---|
| Subjects: | UNSPECIFIED |
| Divisions: | Div F > Signal Processing and Communications |
| Depositing User: | Cron Job |
| Date Deposited: | 08 Jan 2018 20:12 |
| Last Modified: | 27 Oct 2020 07:12 |
| DOI: | 10.1109/ISIT.2012.6283059 |