Publication detail

Strategies for Training Large Scale Neural Network Language Models

MIKOLOV, T.; DEORAS, A.; POVEY, D.; BURGET, L.; ČERNOCKÝ, J.

Original title

Strategies for Training Large Scale Neural Network Language Models

English title

Strategies for Training Large Scale Neural Network Language Models

Language

en

Original abstract

Techniques for effective training of recurrent neural network based language models are described, and new state-of-the-art results on a standard speech recognition task are reported.

English abstract

Techniques for effective training of recurrent neural network based language models are described, and new state-of-the-art results on a standard speech recognition task are reported.

Documents

BibTeX


@inproceedings{BUT76453,
  author="Tomáš {Mikolov} and Anoop {Deoras} and Daniel {Povey} and Lukáš {Burget} and Jan {Černocký}",
  title="Strategies for Training Large Scale Neural Network Language Models",
  annote="Techniques for effective training of recurrent neural network based language
models are described, and new state-of-the-art results on a standard speech
recognition task are reported.",
  address="IEEE Signal Processing Society",
  booktitle="Proceedings of ASRU 2011",
  chapter="76453",
  edition="NEUVEDEN",
  howpublished="print",
  institution="IEEE Signal Processing Society",
  year="2011",
  month="december",
  pages="196--201",
  publisher="IEEE Signal Processing Society",
  type="conference paper"
}