CUED Publications database

Development of low entropy coding in a recurrent network.

Harpur, GF and Prager, RW (1996) Development of low entropy coding in a recurrent network. Network, 7 (2). pp. 277-284. ISSN 0954-898X


Abstract

In this paper we present an unsupervised neural network which exhibits competition between units via inhibitory feedback. The operation is such as to minimize reconstruction error, both for individual patterns, and over the entire training set. A key difference from networks which perform principal components analysis, or one of its variants, is the ability to converge to non-orthogonal weight values. We discuss the network's operation in relation to the twin goals of maximizing information transfer and minimizing code entropy, and show how the assignment of prior probabilities to network outputs can help to reduce entropy. We present results from two binary coding problems, and from experiments with image coding.
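The abstract describes outputs that settle under inhibitory feedback so as to minimize reconstruction error for each pattern, with weights adapted to reduce error over the whole training set. A minimal NumPy sketch of that general scheme is below; the network sizes, learning rates, and data generator are all illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 patterns lying in a 3-dimensional subspace of R^6.
# B and S are illustrative stand-ins for whatever generated the data.
B = rng.normal(size=(3, 6))               # hidden "source" vectors (assumed)
S = rng.uniform(0.0, 1.0, size=(200, 3))  # mixing coefficients (assumed)
X = S @ B                                 # training patterns

W = 0.1 * rng.normal(size=(6, 3))  # weights; columns play the role of basis vectors

def settle(x, W, eta=0.1, steps=100):
    """Settle output activations a by gradient descent on ||x - W a||^2.
    W @ a acts as inhibitory feedback: each unit is driven only by the
    residual left after subtracting the current reconstruction."""
    a = np.zeros(W.shape[1])
    for _ in range(steps):
        residual = x - W @ a
        a += eta * (W.T @ residual)
    return a

def mean_sq_error(X, W):
    return float(np.mean([np.sum((x - W @ settle(x, W)) ** 2) for x in X]))

err_before = mean_sq_error(X, W)

mu = 0.01  # weight learning rate (illustrative)
for epoch in range(20):
    for x in X:
        a = settle(x, W)
        # Hebbian-style update on the residual: reduces reconstruction
        # error over the training set.
        W += mu * np.outer(x - W @ a, a)

err_after = mean_sq_error(X, W)
print(f"mean error before: {err_before:.3f}, after: {err_after:.3f}")
```

Note that nothing in this update forces the columns of W to be orthogonal, which corresponds to the contrast the abstract draws with PCA-style networks.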

Item Type: Article
Divisions: Div F > Machine Intelligence
DOI: 10.1088/0954-898X/7/2/007
