Re: [Moses-support] Deploying large models

2017-12-11 Thread Hieu Hoang
Yes, but it only supports ProbingPT. And it's best if you run the program addLexROtoPT to merge the pt and lex reordering model. There is no equivalent in Moses(1).

Hieu Hoang
http://moses-smt.org/

On 11 December 2017 at 11:51, liling tan wrote:
> Thank you Hieu for Moses2 tips!
>
> BTW, is
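[Editor's note: the workflow Hieu describes would look roughly like the sketch below. The exact argument order and flag names for addLexROtoPT and CreateProbingPT vary between Moses builds, and the ProbingPT line is only illustrative, so check each tool's --help and the Moses2 page before copying anything:

    # merge the lexicalized reordering scores into the phrase table
    # (argument order is an assumption -- check addLexROtoPT --help)
    addLexROtoPT phrase-table.gz reordering-table.gz | gzip > pt-with-lexro.gz

    # convert the merged text phrase table into a binary ProbingPT directory
    # (flag names are an assumption -- check CreateProbingPT --help)
    CreateProbingPT --input-pt pt-with-lexro.gz --output-dir probing-pt

    # moses.ini then points at the ProbingPT directory instead of the old
    # PhraseDictionary line, e.g. (paths and feature counts depend on your model)
    ProbingPT name=TranslationModel0 num-features=4 path=probing-pt input-factor=0 output-factor=0
]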

Re: [Moses-support] Deploying large models

2017-12-11 Thread liling tan
Thank you Hieu for the Moses2 tips!

BTW, is Moses2 backwards compatible with models trained with old Moses?

Regards,
Liling

On Mon, Dec 11, 2017 at 7:39 PM, Hieu Hoang wrote:
> if you want fast decoding with more than 16 threads, use Moses2.
> http://www.statmt.org/moses/?n=Site.Moses2
>
> Hieu

Re: [Moses-support] Deploying large models

2017-12-11 Thread Hieu Hoang
If you want fast decoding with more than 16 threads, use Moses2.
http://www.statmt.org/moses/?n=Site.Moses2

Hieu Hoang
http://moses-smt.org/

On 11 December 2017 at 09:20, liling tan wrote:
> Dear Moses community/developers,
>
> I have a question on how to handle large models created using m
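[Editor's note: a minimal invocation sketch, assuming the moses2 binary takes the same -f and -threads options as the classic moses decoder; the flag spelling may differ in your build:

    # decode with a large thread count; Moses2 is designed to scale past ~16 threads
    ./moses2 -f moses.ini -threads 32 < input.tok > output.tok
]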

[Moses-support] Deploying large models

2017-12-11 Thread liling tan
Dear Moses community/developers,

I have a question on how to handle large models created using Moses. I have a vanilla phrase-based model with:

- PhraseDictionary num-features=4 input-factor=0 output-factor=0
- LexicalReordering num-features=6 input-factor=0 output-factor=0
- KENLM order=5
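[Editor's note: for readers less familiar with Moses, the [feature] section of a moses.ini for this kind of vanilla phrase-based model typically looks something like the sketch below. The name=, path=, and type= values are placeholders for illustration, not taken from the original message:

    [feature]
    PhraseDictionary name=TranslationModel0 num-features=4 path=/path/to/phrase-table.gz input-factor=0 output-factor=0
    LexicalReordering name=LexicalReordering0 num-features=6 type=wbe-msd-bidirectional-fe-allff input-factor=0 output-factor=0 path=/path/to/reordering-table.gz
    KENLM name=LM0 factor=0 order=5 path=/path/to/lm.binary
]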