maybe try
-encoding None
On 08/12/2016 19:44, Shubham Khandelwal wrote:
Hi Hieu,
Thanks for your reply.
Yes, I have used the absolute path, and I also tried with -T, but it did not
work.
Is there any other solution to this problem?
Btw, can anybody please upload compact versions of all the pre-made models,
as this will take less space and also be very fast during
The previous email you referred to says that the directory

    binarised-model/

must exist before you run it, otherwise it will segfault. I would also use an
absolute path to be sure, i.e. not

    binarised-model/phrase-table

but

    /home/shubham/moses/binarised-model/phrase-table

The
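In script form, the workflow above looks roughly like this (a sketch only:
the processPhraseTableMin options follow the standard Moses baseline usage,
and the paths are examples to replace with your own):

```shell
# The output directory must exist before binarising, otherwise the
# binariser segfaults (as noted above). Example path only.
MODEL_DIR="$HOME/moses/binarised-model"
mkdir -p "$MODEL_DIR"

# Binarise using absolute paths. Options as in the Moses baseline
# tutorial; -nscores must match the number of scores in your table.
# Guarded so the sketch still runs where Moses is not installed.
if command -v processPhraseTableMin >/dev/null 2>&1; then
    processPhraseTableMin \
        -in "$HOME/moses/train/model/phrase-table.gz" \
        -nscores 4 \
        -out "$MODEL_DIR/phrase-table"
fi
```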
Hello Hakimeh,
the branch https://github.com/moses-smt/mosesdecoder/tree/nmt-hybrid
supports NMT (which is basically an RNN conditioned on the source text)
as a feature function in Moses. It is described in this paper:
@inproceedings{junczysdowmunt-dwojak-sennrich:2016:WMT,
address =
Hello,
This is just a reminder of my previous email.
Thank you.
Regards,
Shubham
On Thu, Dec 8, 2016 at 9:04 AM, Shubham Khandelwal
wrote:
> Hello,
>
> I have just downloaded the phrase-table.2.gz (18 GB) de-en model
> and the phrase-table.3.gz (22 GB) fr-en model from the
Dear Moses Community,
I'm searching for some RNN-based features added to the Moses decoder or
used in reranking Moses' n-best lists.
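Such n-best reranking can be sketched in a few lines of Python, assuming
Moses' standard |||-separated n-best output; rnn_score here is a
hypothetical placeholder for whatever recurrent scorer one would plug in:

```python
# Minimal sketch of reranking a Moses n-best list with an extra score.
# Assumes the standard Moses n-best line format:
#   sent_id ||| hypothesis ||| feature scores ||| total model score
# rnn_score is a hypothetical placeholder, NOT part of Moses.

def rnn_score(hypothesis: str) -> float:
    # Placeholder: a real implementation would query an RNN LM here.
    return -float(len(hypothesis.split()))

def rerank(nbest_lines, weight=1.0):
    """Group hypotheses by sentence id and pick the best combined score."""
    by_sent = {}
    for line in nbest_lines:
        sent_id, hyp, _feats, score = [f.strip() for f in line.split("|||")]
        combined = float(score) + weight * rnn_score(hyp)
        by_sent.setdefault(sent_id, []).append((combined, hyp))
    # Highest combined score wins for each sentence.
    return {sid: max(hyps)[1] for sid, hyps in by_sent.items()}

nbest = [
    "0 ||| das ist gut ||| d: 0 lm: -3.2 ||| -3.2",
    "0 ||| das ist sehr gut ||| d: 0 lm: -2.9 ||| -2.9",
]
print(rerank(nbest))  # the shorter hypothesis wins after the length penalty
```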
It seems Moses has some LM features based on feedforward neural networks,
but I haven't found any implementation of recurrent neural networks in
Moses.
Apparently