Re: [Moses-support] Increasing context scope during training

2013-12-10 Thread Dimitris Mavroeidis
Dear Rūdolfs, You must be referring to the language model's n-gram size. If you are using EMS, you can set "order" in the LM section of the configuration file. Setting a higher n-gram order (usually no more than 5) often helps, but that depends on various factors, especially the target
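[For readers finding this thread later: a minimal sketch of what the EMS setting looks like in config.ini. The section name and corpus name are hypothetical; the relevant key is "order".]

```ini
; Hypothetical LM section of an EMS config.ini
[LM:my-corpus]
; raise the n-gram order from the default 3 to 5
order = 5
```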

Re: [Moses-support] Increasing context scope during training

2013-12-10 Thread Rūdolfs Mazurs
Thanks for the pointer, Dimitris! Although I don't use EMS, I guess the script irstlm/bin/build-lm.sh is responsible for the LM part, and the option is -n (order of language model, default 3). Thanks again! On 2013-12-10 at 15:12 +0200, Dimitris Mavroeidis wrote: Dear Rūdolfs, You must
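[A sketch of the IRSTLM invocation being discussed, assuming a plain-text training corpus; the file names here are hypothetical, and only the -n flag comes from the thread.]

```shell
# Build a 5-gram LM with IRSTLM instead of the default 3-gram
# (-i input corpus, -n n-gram order, -o output LM file)
irstlm/bin/build-lm.sh -i corpus.txt -n 5 -o lm5.ilm.gz
```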

[Moses-support] Increasing context scope during training

2013-12-09 Thread Rūdolfs Mazurs
Hi all, I am looking to improve the quality of translation on my limited corpus. During the training process I noticed that n-grams only go up to 3. Is there a way to increase the upper limit on the n-gram order? And is there a chance it would improve translation results? -- Rūdolfs Mazurs