CUED Publications database

Recurrent neural network language models for keyword search

Chen, X and Ragni, A and Vasilakes, J and Liu, X and Knill, K and Gales, MJF (2017) Recurrent neural network language models for keyword search. In: UNSPECIFIED pp. 5775-5779.

Full text not available from this repository.


© 2017 IEEE. Recurrent neural network language models (RNNLMs) have become increasingly popular in many applications such as automatic speech recognition (ASR). Significant performance improvements in both perplexity and word error rate over standard n-gram LMs have been widely reported on ASR tasks. In contrast, published research on using RNNLMs for keyword search systems has been relatively limited. In this paper the application of RNNLMs to the IARPA Babel keyword search task is investigated. In order to supplement the limited acoustic transcription data, large amounts of web texts are also used in vocabulary design and LM training. Various training criteria were then explored to improve RNNLMs' efficiency in both training and evaluation. Significant and consistent improvements on both keyword search and ASR tasks were obtained across all languages.
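To illustrate the kind of model the abstract refers to, below is a minimal sketch of an Elman-style RNN language model scoring a token sequence by perplexity. All names, dimensions, and the random initialisation are illustrative assumptions, not the architecture or training setup used in the paper.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def rnnlm_perplexity(seq, vocab_size, hidden_size, seed=0):
    """Perplexity of a token-id sequence under a randomly initialised
    Elman-style RNN LM (illustrative sketch only, untrained weights)."""
    rng = np.random.default_rng(seed)
    E = rng.normal(0, 0.1, (vocab_size, hidden_size))   # input embeddings
    W = rng.normal(0, 0.1, (hidden_size, hidden_size))  # recurrent weights
    U = rng.normal(0, 0.1, (hidden_size, vocab_size))   # output projection
    h = np.zeros(hidden_size)
    log_prob = 0.0
    for prev, nxt in zip(seq[:-1], seq[1:]):
        h = np.tanh(E[prev] + W @ h)   # hidden state carries full history
        p = softmax(h @ U)             # distribution over the next token
        log_prob += np.log(p[nxt])
    n_predictions = len(seq) - 1
    return float(np.exp(-log_prob / n_predictions))
```

With untrained weights the predictive distribution is near-uniform, so the perplexity of any sequence is close to the vocabulary size; training would lower it. The unbounded hidden state `h` is what lets an RNNLM condition on the whole history, unlike the fixed-order n-gram LMs the abstract compares against.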

Item Type: Conference or Workshop Item (UNSPECIFIED)
Divisions: Div F > Machine Intelligence
Depositing User: Cron Job
Date Deposited: 15 Aug 2017 01:23
Last Modified: 31 May 2018 02:01