Hello Hakimeh,

The branch https://github.com/moses-smt/mosesdecoder/tree/nmt-hybrid adds support for NMT (essentially an RNN conditioned on the source text) as a feature function in Moses. It is described in this paper:
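To give an idea of what such a feature function computes, here is a minimal sketch (not the actual nmt-hybrid code; all names, weights, and dimensions are made up for illustration): the feature is just the log-probability of the target phrase under a toy RNN whose state is conditioned on the source, added as one more score per hypothesis.

```python
import math
import random

random.seed(0)
DIM, VOCAB = 4, 5  # toy sizes, purely illustrative


def rand_mat(rows, cols):
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]


E = rand_mat(VOCAB, DIM)  # shared toy embeddings
W = rand_mat(DIM, DIM)    # recurrent weights
U = rand_mat(DIM, DIM)    # input weights
O = rand_mat(VOCAB, DIM)  # output projection


def matvec(M, v):
    return [sum(a * b for a, b in zip(row, v)) for row in M]


def log_softmax(xs):
    m = max(xs)
    lse = m + math.log(sum(math.exp(x - m) for x in xs))
    return [x - lse for x in xs]


def nmt_feature(src_ids, tgt_ids):
    # "Conditioned on the source": here simply the mean source embedding.
    ctx = [sum(E[i][d] for i in src_ids) / len(src_ids) for d in range(DIM)]
    h = ctx[:]  # initialise the decoder state from the source context
    score = 0.0
    for t in tgt_ids:
        logits = matvec(O, h)
        score += log_softmax(logits)[t]  # log P(t | history, source)
        inp = [e + c for e, c in zip(E[t], ctx)]
        h = [math.tanh(a + b) for a, b in zip(matvec(W, h), matvec(U, inp))]
    return score  # used as one feature score for the hypothesis


score = nmt_feature([1, 2], [3, 4])
print(score)  # a (negative) log-probability
```

In the real system the model is an attention-based encoder-decoder and the score is combined with the other Moses feature functions during decoding (or in n-best reranking), with its weight tuned like any other feature.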

@inproceedings{junczysdowmunt-dwojak-sennrich:2016:WMT,
    author = "Junczys-Dowmunt, Marcin and Dwojak, Tomasz and Sennrich, Rico",
    title = "{The AMU-UEDIN Submission to the WMT16 News Translation Task: Attention-based NMT Models as Feature Functions in Phrase-based SMT}",
    booktitle = "{Proceedings of the First Conference on Machine Translation}",
    address = "Berlin, Germany",
    month = "August",
    year = "2016",
    pages = "319--325",
    publisher = "Association for Computational Linguistics",
    url = "http://www.aclweb.org/anthology/W/W16/W16-2316"
}

I'm afraid the documentation is currently lacking, though.

best wishes,
Rico

On 08/12/16 08:34, Shafagh Fadaee wrote:
Dear Moses Community,
I'm searching for RNN-based features that have been added to the Moses decoder or used in reranking Moses' n-best lists.
It seems Moses has some (LM) features based on feedforward neural networks, but I haven't found any implementation of recurrent neural networks in Moses.
Apparently there are many efforts in this field, but I have trouble finding released code to use as a guideline for my work.
I would be happy to hear your suggestions.
Best regards,
Hakimeh Fadaei


_______________________________________________
Moses-support mailing list
Moses-support@mit.edu
http://mailman.mit.edu/mailman/listinfo/moses-support
