CUED Publications database

Parameterised sigmoid and ReLU hidden activation functions for DNN acoustic modelling

Zhang, C and Woodland, PC (2015) Parameterised sigmoid and ReLU hidden activation functions for DNN acoustic modelling. In: UNSPECIFIED pp. 3224-3228.

Full text not available from this repository.

Abstract

Copyright © 2015 ISCA. The form of the hidden activation functions has always been an important issue in deep neural network (DNN) design. The most common choices for acoustic modelling are the standard Sigmoid and the rectified linear unit (ReLU), which are normally used with fixed function shapes and no adaptive parameters. Recently, several papers have studied the use of parameterised activation functions for both computer vision and speaker adaptation tasks. In this paper, we investigate generalised forms of both Sigmoid and ReLU with learnable parameters, as well as their integration with the standard DNN acoustic model training process. Experiments using conversational telephone speech (CTS) Mandarin data result in average relative word error rate (WER) reductions of 3.4% and 2.0% with the Sigmoid and ReLU parameterisations, respectively.
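As a rough illustration of the idea in the abstract, the following is a minimal NumPy sketch of generic parameterised activations: a sigmoid with learnable output scale and input slope, and a PReLU-style ReLU with a learnable negative slope. The parameter names (`eta`, `gamma`, `alpha`) and exact functional forms are assumptions for illustration and may differ from the parameterisations defined in the paper itself.

```python
import numpy as np

def p_sigmoid(x, eta=1.0, gamma=1.0):
    """Parameterised sigmoid: eta scales the output range, gamma the
    input slope. In a DNN, eta and gamma would be learned per hidden
    unit jointly with the weights (hypothetical form)."""
    return eta / (1.0 + np.exp(-gamma * np.asarray(x, dtype=float)))

def p_relu(x, alpha=0.25):
    """Parameterised ReLU (PReLU-style): alpha is a learnable slope
    applied to negative inputs instead of a fixed zero."""
    x = np.asarray(x, dtype=float)
    return np.where(x > 0.0, x, alpha * x)
```

With `eta=1` and `gamma=1` the parameterised sigmoid reduces to the standard sigmoid, and with `alpha=0` the parameterised ReLU reduces to the standard ReLU, so the fixed-shape activations are special cases of the learnable ones.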

Item Type: Conference or Workshop Item (UNSPECIFIED)
Subjects: UNSPECIFIED
Divisions: Div F > Machine Intelligence
Depositing User: Cron Job
Date Deposited: 17 Jul 2017 19:01
Last Modified: 28 Sep 2017 01:45
DOI: