Hi,

I trained a Chinese-to-Spanish unfactored model and everything worked perfectly.
But when I try to train a factored model for the same task, I run into trouble
while tuning. The factors I am using are words only on the Chinese side, and
words, lemmas and POS tags on the Spanish side.
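
For reference, the factored training data uses the usual '|'-separated Moses
factor format; the Spanish side looks roughly like this (illustrative tokens,
not my actual corpus):

 train.zh:  我 喜欢 猫
 train.es:  me|me|PRON gustan|gustar|VERB los|el|DET gatos|gato|NOUN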

Training seems to finish correctly and the phrase table shows all the factors,
but tuning only does 2 runs and then prints a message saying that the weights
have not changed in the last run, so the original weights are kept. Also, when
translating, the BLEU score obtained is worse than the one obtained with the
unfactored model.
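
If I understand the mert-moses.pl working-directory layout correctly, the
per-run configs it writes there can be diffed to see where the weights stopped
moving, e.g.:

 diff $WORKING_DIR/baseline/tuning/run1.moses.ini \
      $WORKING_DIR/baseline/tuning/run2.moses.ini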


These are my calls for training and tuning the model:

$SCRIPTS_ROOTDIR/training/train-model.perl \
    -external-bin-dir $GIZA_DIR/mgiza-bin -mgiza \
    --corpus $WORKING_DIR/train/train \
    --alignment grow-diag-final-and \
    --score-options '--GoodTuring' \
    --root-dir $WORKING_DIR/baseline/ \
    --f zh --e es \
    --lm 0:5:$WORKING_DIR/baseline/lm/words.lm.es:0 \
    --translation-factors 0-0,1,2 \
    --reordering msd-bidirectional-fe \
    --reordering-factors 0-0
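
With this setup the phrase table does contain all three target factors; an
entry looks roughly like this (made-up values):

 猫 ||| gatos|gato|NOUN ||| 0.3 0.2 0.4 0.3 ||| 0-0 ||| 10 8 7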

$MOSES_SCRIPTS/training/mert-moses.pl \
  $WORKING_DIR/dev/dev.zh \
  $WORKING_DIR/dev/dev.es \
  $MOSES_DIR/moses-cmd/bin/gcc-4.8.5/release/link-static/threading-multi/moses \
  $WORKING_DIR/baseline/model/moses.ini \
  --nbest 100 \
  --working-dir $WORKING_DIR/baseline/tuning/ \
  --decoder-flags "-drop-unknown -mbr -threads 24 -mp -v 0" \
  --rootdir $MOSES_SCRIPTS \
  --mertdir $MOSES_DIR/bin/ \
  -threads 24 \
  --filtercmd '/veu4/usuaris24/xtrans/mosesdecoder/scripts/training/filter-model-given-input.pl'

/veu4/usuaris24/smt/softlic/mosesdecoder/scripts/ems/support/reuse-weights.perl \
  $WORKING_DIR/baseline/tuning/moses.ini \
  < $WORKING_DIR/baseline/model/moses.ini \
  > $WORKING_DIR/baseline/tuning/moses.weight-reused.ini
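
A quick way to check that the tuned weights made it into the reused config
(assuming the newer single [weight] section in the moses.ini format):

 grep -A 20 '^\[weight\]' $WORKING_DIR/baseline/tuning/moses.weight-reused.ini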


Best regards,

Carlos
