In this article we describe HiFST, a lattice-based decoder for hierarchical phrase-based translation and alignment. The decoder is implemented with standard Weighted Finite-State Transducer (WFST) operations as an alternative to the well-known cube pruning procedure. We find that the use of WFSTs rather than k-best lists requires less pruning in translation search, resulting in fewer search errors, better parameter optimization, and improved translation performance. The direct generation of translation lattices in the target language can improve subsequent rescoring procedures, yielding further gains when applying long-span language models and Minimum Bayes Risk decoding. We also provide insights into how to control the size of the search space defined by hierarchical rules. We show that shallow-n grammars, low-level rule catenation, and other search constraints can help to match the power of the translation system to specific language pairs.
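
The WFST operations in question are the standard ones: union over alternative translations, concatenation over adjacent spans, and the usual automaton optimizations. The following is a minimal sketch, assuming the OpenFst library and toy integer word labels and costs (not the authors' actual implementation or grammar), of how a cell's translation lattice might be assembled and reduced with these operations before reading off a best path.

    // Minimal sketch (not the authors' code): combining translation
    // alternatives with standard WFST operations, assuming the OpenFst
    // library. Word ids and costs are illustrative toy values.
    #include <vector>
    #include <fst/fstlib.h>

    using fst::StdArc;
    using fst::StdVectorFst;
    using fst::TropicalWeight;

    // Build a single-path acceptor over the given word ids, putting the
    // (toy) rule cost on the final state.
    StdVectorFst PathFst(const std::vector<int>& words, float cost) {
      StdVectorFst f;
      auto s = f.AddState();
      f.SetStart(s);
      for (int w : words) {
        auto t = f.AddState();
        f.AddArc(s, StdArc(w, w, TropicalWeight::One(), t));
        s = t;
      }
      f.SetFinal(s, TropicalWeight(cost));
      return f;
    }

    int main() {
      // Two alternative translations of a sub-span: union encodes the choice.
      StdVectorFst cell = PathFst({1, 2}, 0.5);
      fst::Union(&cell, PathFst({3}, 1.2));

      // An adjacent sub-span: concatenation encodes the sequencing of spans.
      fst::Concat(&cell, PathFst({4, 5}, 0.3));

      // Standard optimizations keep the lattice compact, with no k-best lists.
      fst::RmEpsilon(&cell);
      StdVectorFst det;
      fst::Determinize(cell, &det);
      fst::Minimize(&det);

      // The single best hypothesis (or an n-best list) can still be read off.
      StdVectorFst best;
      fst::ShortestPath(det, &best, 1);
      best.Write("best.fst");
      return 0;
    }

In the decoder described in the article, operations of this kind are applied over the cells of a CYK grid, with the language model applied by composition; the sketch above only illustrates the building blocks.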

Author notes

* University of Cambridge, Department of Engineering. CB2 1PZ Cambridge, U.K. E-mail: {ad465,gwb24,wjb31}@eng.cam.ac.uk.

** University of Vigo, Department of Signal Processing and Communications, 36310 Vigo, Spain. E-mail: {giglesia,erbanga}@gts.tsc.uvigo.es.