Yes, but it only supports ProbingPT.
It's best if you run the program
addLexROtoPT
to merge the phrase table and the lexicalized reordering model. There is no equivalent in the original Moses.
Hieu Hoang
http://moses-smt.org/
On 11 December 2017 at 11:51, liling tan wrote:
Thank you Hieu for Moses2 tips!
BTW, is Moses2 backwards compatible for models trained with old Moses?
Regards,
Liling
On Mon, Dec 11, 2017 at 7:39 PM, Hieu Hoang wrote:
If you want fast decoding with more than 16 threads, use Moses2.
http://www.statmt.org/moses/?n=Site.Moses2
Hieu Hoang
http://moses-smt.org/
On 11 December 2017 at 09:20, liling tan wrote:
Dear Moses community/developers,
I have a question on how to handle large models created using Moses.
I have a vanilla phrase-based model with:
- PhraseDictionary num-features=4 input-factor=0 output-factor=0
- LexicalReordering num-features=6 input-factor=0 output-factor=0
- KENLM order=5
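Putting Hieu's advice together with the model above: after converting the phrase table to the binary ProbingPT format and folding the lexicalized reordering scores into it with addLexROtoPT, the moses.ini feature section would end up with a single ProbingPT line in place of the separate PhraseDictionary and LexicalReordering lines. The sketch below is illustrative only; the paths are placeholders and the exact argument names should be checked against the Moses2 page linked above:

```ini
# Hypothetical [feature] section for Moses2 (argument names and paths
# are assumptions, not verified syntax). The merged ProbingPT table
# carries both the translation and the reordering scores.
[feature]
ProbingPT name=TranslationModel0 num-features=4 input-factor=0 output-factor=0 path=/path/to/probing-pt
KENLM name=LM0 order=5 factor=0 path=/path/to/lm.arpa

[weight]
TranslationModel0= 0.2 0.2 0.2 0.2
LM0= 0.5
```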