Thanks. It may not be the phrase-based/hiero mix-up I suggested earlier.

Maybe the system ran out of RAM.

Before you run tuning, you should also binarize the phrase table and
lexicalised reordering model. This can be done during the filtering step by
giving the filter script the absolute path of the binarizer, e.g.
   -Binarizer "/home/s0565741/workspace/github/hh/bin/processPhraseTableMin"
This step log shows an example:

http://www.statmt.org/moses/RELEASE-3.0/models/de-en/steps/1/TUNING_filter.1

Binarizing reduces RAM usage because the decoder then loads only the
translation rules needed for the particular sentence it is translating.
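For reference, after binarizing with processPhraseTableMin the filtered
moses.ini should point at the compact table via PhraseDictionaryCompact
rather than PhraseDictionaryMemory. A rough sketch (the path and name here
are illustrative, not taken from your setup) would look like:

```
[feature]
PhraseDictionaryCompact name=TranslationModel0 num-features=4
    path=.../mert-work/filtered/phrase-table.0-0.1.1.minphr
    input-factor=0 output-factor=0
```

If your filtered moses.ini still says PhraseDictionaryMemory, the whole
gzipped table is being loaded into RAM, which would fit the out-of-memory
theory.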



* Looking for MT/NLP opportunities *
Hieu Hoang
http://moses-smt.org/


On 17 May 2017 at 11:42, Per Starbäck <starb...@stp.lingfil.uu.se> wrote:

> Hieu Hoang writes:
> > It may be you're mixing up the phrase-based with the syntax/hiero model.
> Can you please
> > attach the file
> > filtered/moses.ini
> > and perhaps the first few lines of the phrase-table that's used
>
> Sure! Here is mert-work/filtered/moses.ini, again just with the
> beginning of absolute paths replaced with "...":
>
> ======================================================================
> # input factors
> [input-factors]
> 0
>
> # mapping steps
> [mapping]
> 0 T 0
>
> [distortion-limit]
> 6
>
> # feature functions
> [feature]
> UnknownWordPenalty
> WordPenalty
> PhrasePenalty
> PhraseDictionaryMemory name=TranslationModel0 num-features=4
> path=.../mert-work/filtered/phrase-table.0-0.1.1.gz input-factor=0
> output-factor=0
> LexicalReordering name=LexicalReordering0 num-features=6
> type=wbe-msd-bidirectional-fe-allff input-factor=0 output-factor=0
> path=.../mert-work/filtered/reordering-table.wbe-msd-bidirectional-fe
> Distortion
> KENLM lazyken=0 name=LM0 factor=0 path=.../lm/news-commentrary-v8.de-en.blm.en
> order=3
>
> # dense weights for feature functions
> [weight]
> UnknownWordPenalty0= 1
> WordPenalty0= -1
> PhrasePenalty0= 0.2
> TranslationModel0= 0.2 0.2 0.2 0.2
> LexicalReordering0= 0.3 0.3 0.3 0.3 0.3 0.3
> Distortion0= 0.3
> LM0= 0.5
> ======================================================================
>
> And here the beginning of mert-work/filtered/phrase-table.0-0.1.1.gz
> uncompressed:
>
> ======================================================================
> $ ||| $100 they earn. ||| 0.25 0.0252101 0.025 1.54085e-10 ||| 0-0 ||| 4
> 40 1 ||| |||
> $ ||| $100 they ||| 0.25 0.0252101 0.025 2.56808e-05 ||| 0-0 ||| 4 40 1
> ||| |||
> $ ||| $100 ||| 0.00819672 0.0252101 0.025 0.0084507 ||| 0-0 ||| 122 40 1
> ||| |||
> $ ||| $100,000 apiece in campaign contributions in ||| 0.5 1 0.025
> 1.16584e-20 ||| 0-1 ||| 2 40 1 ||| |||
> $ ||| $100,000 apiece in campaign contributions ||| 0.5 1 0.025
> 6.46129e-19 ||| 0-1 ||| 2 40 1 ||| |||
> $ ||| $100,000 apiece in campaign ||| 0.5 1 0.025 1.54207e-14 ||| 0-1 |||
> 2 40 1 ||| |||
> ======================================================================
>
_______________________________________________
Moses-support mailing list
Moses-support@mit.edu
http://mailman.mit.edu/mailman/listinfo/moses-support
