Harrison, M and Kontoyiannis, I (2006) *On estimating the rate-distortion function.* In: UNSPECIFIED, pp. 267-271.

## Abstract

Suppose a string X_1^n = (X_1, X_2, ..., X_n) is generated by a stationary memoryless source (X_n)_{n≥1} with unknown distribution P. When the source is finite-valued, the problem of estimating the entropy H(P) from the data X_1^n has received a lot of attention. Perhaps the simplest method is the so-called plug-in estimator H(P_{X_1^n}), where P_{X_1^n} is the empirical distribution of the data X_1^n. This estimator is always strongly consistent, that is, H(P_{X_1^n}) → H(P) with probability one, as n → ∞. In this work we consider the natural generalization of estimating the rate-distortion function R(D, P). Our motivation comes from questions in lossy data compression and from cases where the data under consideration do not take values in a discrete alphabet. Our primary focus is the asymptotic behavior of the plug-in estimator R(P_{X_1^n}, D). This estimator need not be consistent, but in many cases it is. Several extensions are also considered, including stationary ergodic sources, and instances where the rate-distortion function is defined over a restricted class of coding distributions. © 2006 IEEE.
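For a finite alphabet, both plug-in estimators described above can be computed directly: H(P_{X_1^n}) from empirical frequencies, and R(P_{X_1^n}, D) by running a rate-distortion solver on the empirical distribution. The sketch below uses the standard Blahut-Arimoto iteration for the latter; the function names, the slope parameter `s`, and the iteration count are illustrative choices, not details from the paper.

```python
import numpy as np
from collections import Counter

def plugin_entropy(data):
    """Plug-in entropy estimate H(P_hat) in bits, where P_hat is the
    empirical distribution of the observed string."""
    counts = np.array(list(Counter(data).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def blahut_arimoto_rd(p, d, s, iters=500):
    """One point (D, R) on the rate-distortion curve of a memoryless
    source with distribution p and distortion matrix d, computed by the
    Blahut-Arimoto iteration at slope parameter s < 0 (rates in bits).
    Applying this to the empirical distribution gives the plug-in
    estimate of R(D)."""
    n_y = d.shape[1]
    q = np.full(n_y, 1.0 / n_y)      # reproduction marginal, uniform init
    w = 2.0 ** (s * d)               # unnormalized transition kernel
    for _ in range(iters):
        c = w @ q                    # normalizers c(x)
        Q = (w * q) / c[:, None]     # conditional Q(x_hat | x)
        q = p @ Q                    # update reproduction marginal
    c = w @ q
    Q = (w * q) / c[:, None]
    D = float(p @ (Q * d).sum(axis=1))            # achieved distortion
    R = float(s * D - p @ np.log2(c))             # rate at that distortion
    return D, max(R, 0.0)
```

As a sanity check, for a uniform binary source under Hamming distortion the iteration recovers the closed form R(D) = 1 - h(D), where h is the binary entropy function.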

| Item Type: | Conference or Workshop Item (UNSPECIFIED) |
|---|---|
| Subjects: | UNSPECIFIED |
| Divisions: | Div F > Signal Processing and Communications |
| Depositing User: | Cron Job |
| Date Deposited: | 08 Jan 2018 20:12 |
| Last Modified: | 18 Aug 2020 12:42 |
| DOI: | 10.1109/ISIT.2006.261847 |