Paraphrastic language models and combination with neural network language models

Liu, X and Gales, MJF and Woodland, PC (2013) Paraphrastic language models and combination with neural network language models. ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings. pp. 8421-8425. ISSN 1520-6149

Full text not available from this repository.

Abstract

In natural languages multiple word sequences can represent the same underlying meaning. Modelling only the observed surface word sequence can result in poor context coverage, for example, when using n-gram language models (LMs). To address this issue, paraphrastic LMs were proposed in previous research and successfully applied to a US English conversational telephone speech transcription task. In order to exploit the complementary characteristics of paraphrastic LMs and neural network LMs (NNLMs), the combination of the two is investigated in this paper. To assess how well paraphrastic LMs generalize to other languages, experiments are conducted on a Mandarin Chinese broadcast speech transcription task. Using a paraphrastic multi-level LM that models both word and phrase sequences, significant error rate reductions of 0.9% absolute (9% relative) and 0.5% absolute (5% relative) were obtained over the baseline n-gram and NNLM systems respectively, after combination with word and phrase level NNLMs. © 2013 IEEE.
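
The abstract does not describe the combination method itself. A standard way to combine language models of this kind is linear interpolation of the component models' word probabilities; the Python sketch below illustrates that general technique under this assumption. The model interfaces, weights, and probability values are hypothetical stand-ins, not the paper's actual implementation.

    # Hypothetical sketch: linear interpolation of language model probabilities,
    # a standard way to combine component LMs such as an n-gram LM, a
    # paraphrastic LM, and an NNLM. All interfaces and numbers here are
    # illustrative assumptions, not the paper's implementation.

    def interpolate_lm_prob(word, history, models, weights):
        """Linearly interpolate P(word | history) across component LMs.

        models  -- callables, each returning P(word | history)
        weights -- interpolation weights summing to 1 (in practice these
                   are typically tuned on held-out data)
        """
        assert abs(sum(weights) - 1.0) < 1e-9
        return sum(w * lm(word, history) for lm, w in zip(models, weights))

    # Toy stand-ins for the real n-gram, paraphrastic, and neural network
    # LMs; the probabilities returned here are made up for illustration.
    ngram_lm = lambda w, h: 0.10
    paraphrastic_lm = lambda w, h: 0.14
    nnlm = lambda w, h: 0.12

    p = interpolate_lm_prob("word", ("some", "context"),
                            [ngram_lm, paraphrastic_lm, nnlm],
                            [0.4, 0.3, 0.3])
    print(f"interpolated probability: {p:.3f}")  # prints 0.118

In practice the interpolation weights would be estimated to minimise perplexity on a held-out set; the values above are arbitrary.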

Item Type: Article
Subjects: UNSPECIFIED
Divisions: Div F > Machine Intelligence
Depositing User: Cron Job
Date Deposited: 17 Jul 2017 19:01
Last Modified: 17 Oct 2017 01:40
DOI: