BLEU (Papineni et al., 2002) is a widely used metric for machine translation evaluation. However, it fails to rate translations correctly for target languages that are morphologically rich and have relatively free word order, such as Hindi (Ramanathan et al., 2007). In this paper, we present METEOR-Hindi, an automatic evaluation metric for machine translation systems whose target language is Hindi. METEOR-Hindi is a modified version of the METEOR metric, containing features specific to Hindi. We make appropriate changes to METEOR's alignment algorithm and scoring technique. In our experiments, METEOR-Hindi achieved a high correlation of 0.703 with human judgments, significantly outperforming BLEU, which had a correlation of only 0.271.
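The scoring stage that METEOR-Hindi modifies builds on the standard METEOR formula (Banerjee & Lavie, 2005), which combines unigram precision and recall with a fragmentation penalty. The sketch below shows that baseline formula only; it assumes the unigram match and chunk counts have already been produced by an aligner, and it does not reproduce the Hindi-specific changes described in the paper.

```python
def meteor_score(matches, hyp_len, ref_len, chunks):
    """Baseline METEOR: recall-weighted harmonic mean times a chunk penalty."""
    if matches == 0:
        return 0.0
    precision = matches / hyp_len
    recall = matches / ref_len
    # Harmonic mean weighted 9:1 toward recall, as in the original METEOR.
    fmean = (10 * precision * recall) / (recall + 9 * precision)
    # Fragmentation penalty: fewer, longer contiguous chunks are rewarded.
    penalty = 0.5 * (chunks / matches) ** 3
    return fmean * (1 - penalty)

# Hypothetical example: 6 matched unigrams, 8-word hypothesis,
# 7-word reference, matches grouped into 2 chunks.
print(round(meteor_score(6, 8, 7, 2), 4))  # → 0.8294
```

Because the harmonic mean is weighted toward recall and the penalty depends only on how scattered the matches are, the metric is less sensitive to word-order differences than BLEU's n-gram precision, which is one motivation for adapting it to a free-word-order language like Hindi.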

Added on February 29, 2012


  More Details
  • Product Type : Research Paper
  • License Type : Freeware
  • System Requirement : Not Applicable
  • Author : Ankush Gupta, Sriram Venkatapathy, Rajeev Sangal