I can't get any output from my syntactic baseline. Does anybody know what might be wrong?

I trained a string-to-tree baseline and got a rule table like this:

[X][TH] 相当于 [X][RA] [X] ||| [X][TH] is [X][RA] [PROP] ||| 1.67586e-05 4.51106e-08 0.0475 0.177966 ||| 1-0 2-1 3-2 ||| 2083.76 0.735177 0.735177 ||| |||
[X][TH] 相当于 [X][RA] 。 [X] ||| [X][TH] is [X][RA] [PROP] ||| 2.06243e-06 6.65709e-09 0.0475 0.177966 ||| 1-0 2-1 3-2 ||| 2083.76 0.0904762 0.0904762 ||| |||
[X][TH] 相当于 [X][RA] 于 [X] ||| [X][TH] is [X][RA] [PROP] ||| 1.30259e-06 4.61662e-11 0.0475 0.177966 ||| 1-0 2-1 3-2 ||| 2083.76 0.0571429 0.0571429 ||| |||
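
For reference, I read each line as " ||| "-separated fields: source side, target side, the four feature scores, the word alignment, and counts. Here is the quick sanity check I ran (my own Python snippet, not part of Moses) to make sure I parse the table correctly:

# My own sanity check, not part of Moses: split one rule-table line
# into its " ||| "-separated fields.
line = ("[X][TH] 相当于 [X][RA] [X] ||| [X][TH] is [X][RA] [PROP] "
        "||| 1.67586e-05 4.51106e-08 0.0475 0.177966 ||| 1-0 2-1 3-2 "
        "||| 2083.76 0.735177 0.735177 ||| |||")
source, target, scores, alignment, counts = line.split(" ||| ")[:5]
print(scores.split())  # four scores, matching num-features=4 in moses.ini

The fields look well-formed to me, so I don't think the table itself is broken.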

But I never get any output when I decode with this baseline. For example:

input: 3 月

Output on screen:
3 月
Translating line 2  in thread id 47362102691584
Line 2: Initialize search took 0.000 seconds total
Translating: <s> 3 月 </s>  ||| [0,0]=X (1) [0,1]=X (1) [0,2]=X (1) [0,3]=X (1) [1,1]=X (1) [1,2]=X (1) [1,3]=X (1) [2,2]=X (1) [2,3]=X (1) [3,3]=X (1)

  0   1   2   3
  0   8   8   0
    0  29   0
      0   0
        0
Line 2: Additional reporting took 0.000 seconds total
Line 2: Translation took 0.003 seconds total
Translation took 0.000 seconds
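
To rule out a coverage problem, I also scanned the rule table for rules whose source-side terminals are all covered by the input tokens (again a throwaway script of my own; the path is just where my rule table happens to live):

import gzip

# My own diagnostic, not part of Moses: print every rule that could
# apply to the input "3 月". Non-terminals such as [X][TH] are skipped;
# only terminal coverage is checked.
tokens = {"<s>", "3", "月", "</s>"}
with gzip.open("rule-table.gz", "rt", encoding="utf-8") as f:
    for rule in f:
        source = rule.split(" ||| ")[0].split()
        terminals = [t for t in source if not t.startswith("[")]
        if terminals and all(t in tokens for t in terminals):
            print(rule.rstrip())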

Do you know what might be wrong with my baseline?

I run the decoder with:
moses_chart -T -f moses.ini

I trained this baseline with:

train_model.pl --glue-grammar --target-syntax -max-phrase-length=999 \
  --extract-options="--NonTermConsecSource --MinHoleSource 1 --MaxSpan 999 --MinWords 0 --MaxNonTerm 3" \
  -lm 0:5:lmsri.en --corpus train_case --f zh --e en \
  -root-dir train_dir -external-bin-dir bin -mgiza -mgiza-cpus 6 -cores 10 \
  --alignment grow-diag-final-and -score-options ' --GoodTuring'
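
If I understand train_model.pl correctly, --glue-grammar should also have written a glue-grammar file under the model directory. I checked for it like this (the filename is my guess; please correct me if it lives elsewhere):

import os.path

# The path below is only my guess at where the glue grammar ends up.
path = "train_dir/model/glue-grammar"
print(path, "exists" if os.path.exists(path) else "MISSING")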

My moses.ini is as follows:
#########################

# input factors
[input-factors]
0

# mapping steps
[mapping]
0 T 0

[cube-pruning-pop-limit]
1000

[non-terminals]
X

[search-algorithm]
3

[inputtype]
3

[max-chart-span]
20
1000

# feature functions
[feature]
UnknownWordPenalty
WordPenalty
PhrasePenalty
PhraseDictionaryMemory name=TranslationModel0 num-features=4 path=/home/workspace/moses-fbis-case-s2t-ch2en/training_dir/model/rule-table.gz input-factor=0 output-factor=0
KENLM name=LM0 factor=0 path=/home/workspace/data-lm/lmsri.en order=5

# dense weights for feature functions
[weight]
UnknownWordPenalty0= 1
WordPenalty0= -1
PhrasePenalty0= 0.2
TranslationModel0= 0.2 0.2 0.2 0.2
LM0= 0.5



Shi Huaxing
MI&T Lab
School of Computer Science and Technology
Harbin Institute of Technology