Hi,
Thank you for your reply. I ran the following command before the tuning step:
/home/arezoo1/mosesdecoder-master/mosesdecoder-master/bin/CreateProbingPT2 
--input-pt /home/arezoo1/emlla/train/model/rule-table.gz  --output-dir 
/home/arezoo1/emlla/train/model/rule-table.1.gz


and changed the path in moses.ini to the output directory:

PhraseDictionaryMemory name=TranslationModel0 num-features=4 
path=/home/arezoo1/emlla/train/model/rule-table.1.gz input-factor=0 
output-factor=0
PhraseDictionaryMemory name=TranslationModel1 num-features=1 
path=/home/arezoo1/emlla/train/model/glue-grammar input-factor=0 
output-factor=0 tuneable=true
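(For completeness: I kept PhraseDictionaryMemory as the feature type. If the table produced by CreateProbingPT needs its own feature line, I assume from the documentation it would look something like the fragment below — ProbingPT is my guess at the feature name, so please correct me if that is wrong:)

```ini
; hypothetical moses.ini fragment, assuming the ProbingPT feature name
ProbingPT name=TranslationModel0 num-features=4 path=/home/arezoo1/emlla/train/model/rule-table.1.gz input-factor=0 output-factor=0
```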


and ran the tuning step as follows:

/home/arezoo1/mosesdecoder-master/mosesdecoder-master/scripts/training/mert-moses.pl
  /home/arezoo1/emlla/tune/nc-dev2007.true.fr 
/home/arezoo1/emlla/tune/nc-dev2007.true.en  
/home/arezoo1/mosesdecoder-master/mosesdecoder-master/bin/moses2 
/home/arezoo1/emlla/train/model/moses.ini --rootdir 
/home/arezoo1/mosesdecoder-master/mosesdecoder-master/scripts --decoder-flags 
"-v 0" --no-filter-phrase-table --inputtype 3     --batch-mira 
--return-best-dev  --filtercmd 
'/home/arezoo1/mosesdecoder-master/mosesdecoder-master/scripts/training/filter-model-given-input.pl' --batch-mira-args '-J 300' 1>&mert.out
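(A side note on the redirection, in case it matters: the trailing 1>&mert.out is csh-style redirection of both stdout and stderr into the file. Under bash the equivalent would be > mert.out 2>&1, as in this tiny self-contained demonstration with a hypothetical demo.out file:)

```shell
# bash equivalent of csh "1>&file": send stdout and stderr to the same file
{ echo out; echo err 1>&2; } > demo.out 2>&1
cat demo.out
```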
It creates features.list, run1.dense, run1.best100.out, run1.moses.ini, and run1.out.
The mert.out file contains:


Using SCRIPTS_ROOTDIR: 
/home/arezoo1/mosesdecoder-master/mosesdecoder-master/scripts
Assuming --mertdir=/home/arezoo1/mosesdecoder-master/mosesdecoder-master/bin
Using cached features list: ./features.list
MERT starting values and ranges for random generation:
    LM0 =   0.500 ( 0.00 ..  1.00)
  WordPenalty0 =  -1.000 ( 0.00 ..  1.00)
  PhrasePenalty0 =   0.200 ( 0.00 ..  1.00)
  TranslationModel0 =   0.200 ( 0.00 ..  1.00)
  TranslationModel0 =   0.200 ( 0.00 ..  1.00)
  TranslationModel0 =   0.200 ( 0.00 ..  1.00)
  TranslationModel0 =   0.200 ( 0.00 ..  1.00)
  TranslationModel1 =   1.000 ( 0.00 ..  1.00)
featlist: LM0=0.500000 
featlist: WordPenalty0=-1.000000 
featlist: PhrasePenalty0=0.200000 
featlist: TranslationModel0=0.200000 
featlist: TranslationModel0=0.200000 
featlist: TranslationModel0=0.200000 
featlist: TranslationModel0=0.200000 
featlist: TranslationModel1=1.000000 
run 1 start at Sun Sep  2 07:51:49 EDT 2018
Parsing --decoder-flags: |-threads 8 -v 0|
Saving new config to: ./run1.moses.ini
Saved: ./run1.moses.ini
Normalizing lambdas: 0.500000 -1.000000 0.200000 0.200000 0.200000 0.200000 
0.200000 1.000000
DECODER_CFG = -weight-overwrite 'PhrasePenalty0= 0.057143 TranslationModel0= 
0.057143 0.057143 0.057143 0.057143 WordPenalty0= -0.285714 LM0= 0.142857 
TranslationModel1= 0.285714'
Executing: /home/arezoo1/mosesdecoder-master/mosesdecoder-master/bin/moses2 
-threads 8 -v 0  -config /home/arezoo1/emlla/train/model/moses.ini -inputtype 3 
-weight-overwrite 'PhrasePenalty0= 0.057143 TranslationModel0= 0.057143 
0.057143 0.057143 0.057143 WordPenalty0= -0.285714 LM0= 0.142857 
TranslationModel1= 0.285714'  -n-best-list run1.best100.out 100 distinct  
-input-file /home/arezoo1/emlla/tune/nc-dev2007.true.fr > run1.out 
Executing: /home/arezoo1/mosesdecoder-master/mosesdecoder-master/bin/moses2 
-threads 8 -v 0  -config /home/arezoo1/emlla/train/model/moses.ini -inputtype 3 
-weight-overwrite 'PhrasePenalty0= 0.057143 TranslationModel0= 0.057143 
0.057143 0.057143 0.057143 WordPenalty0= -0.285714 LM0= 0.142857 
TranslationModel1= 0.285714'  -n-best-list run1.best100.out 100 distinct  
-input-file /home/arezoo1/emlla/tune/nc-dev2007.true.fr > run1.out
(1) run decoder to produce n-best lists
params = -threads 8 -v 0
decoder_config = -weight-overwrite 'PhrasePenalty0= 0.057143 TranslationModel0= 
0.057143 0.057143 0.057143 0.057143 WordPenalty0= -0.285714 LM0= 0.142857 
TranslationModel1= 0.285714'
Starting...
START featureFunctions.Load()
Loading WordPenalty0
Finished loading WordPenalty0
Loading PhrasePenalty0
Finished loading PhrasePenalty0
Loading LM0
Finished loading LM0
Loading UnknownWordPenalty0
Finished loading UnknownWordPenalty0
Loading TranslationModel0
Finished loading TranslationModel0
Loading TranslationModel1
Finished loading TranslationModel1
START LoadMappings()
END LoadMappings()
END LoadDecodeGraphBackoff()
Loaded : [0.00543796] seconds
RUN BATCH
terminate called after throwing an instance of 'util::Exception'
  what():  moses2/SCFG/nbest/KBestExtractor.cpp:42 in void 
Moses2::SCFG::KBestExtractor::OutputToStream(std::stringstream&) threw 
util::Exception because `lastStack.GetColl().size() != 1'.
Only suppose to be 1 hypo coll in last stack
Aborted (core dumped)
Exit code: 134
The decoder died. CONFIG WAS -weight-overwrite 'PhrasePenalty0= 0.057143 
TranslationModel0= 0.057143 0.057143 0.057143 0.057143 WordPenalty0= -0.285714 
LM0= 0.142857 TranslationModel1= 0.285714'
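(Sanity-checking the normalization above: mert-moses.pl appears to divide each starting lambda by the sum of their absolute values, 0.5 + 1.0 + 0.2 + 4×0.2 + 1.0 = 3.5, which reproduces the overwrite weights exactly:)

```shell
# reproduce the -weight-overwrite values from the starting lambdas
awk 'BEGIN {
  s = 0.5 + 1.0 + 0.2 + 4*0.2 + 1.0;   # sum of |lambda| = 3.5
  printf "LM0=%f WordPenalty0=%f PhrasePenalty0=%f TranslationModel1=%f\n",
         0.5/s, -1.0/s, 0.2/s, 1.0/s
}'
```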


I really appreciate your help in resolving the problem.

Arezoo
  On Thursday, August 23, 2018, 6:22:14 AM EDT, Hieu Hoang 
<hieuho...@gmail.com> wrote: 
 
You need to binarize with CreateProbingPT if you use moses2; if you use CreateOnDiskPt then you must use the moses decoder.


Hieu Hoang
http://statmt.org/hieu

On Thu, 23 Aug 2018 at 09:02, Arezoo Arjomand <arezooarjom...@yahoo.com> wrote:


Hi,

I successfully installed the Moses decoder on the server. I ran syntax grammar extraction as follows:

 

/home/vps/mosesdecoder/scripts/training/train-model.perl \
  -root-dir train \
  -corpus /home/vps/emlla//ttt/news-commentary-v8.fr-en.clean \
  -f fr -e en -alignment grow-diag-final-and -hierarchical -glue-grammar \
  -lm 0:3:/home/vps/emlla//ttt/news-commentary-v8.fr-en.blm.en \
  -mgiza -mgiza-cpus 4 \
  -external-bin-dir /home/vps/mosesdecoder/tools >& training2.out

  


The files created are:

 

train → corpus, giza.en-fr, giza.fr-en, model

model → aligned.grow-diag-final-and, extract.sorted.gz, extract.inv.sorted.gz, glue-grammar, lex.e2f, lex.f2e, moses.ini, rule-table.gz

  


To tune the model, the following command is run:

 

/home/vps/mosesdecoder/scripts/training/mert-moses.pl  
/home/vps/emlla/nc-dev2007.en.true.fr  /home/vps/emlla/nc-dev2007.en.true.en  
/home/vps/mosesdecoder/bin/moses2 /home/vps/emlla/train/model/moses.ini 
--rootdir  /home/vps/mosesdecoder/scripts --decoder-flags "-v 0" 
--no-filter-phrase-table --inputtype 3     --batch-mira --return-best-dev  
--filtercmd 
'/home/vps/mosesdecoder/scripts/training/filter-model-given-input.pl -Binarizer 
"CreateOnDiskPt 1 1 5 100 2" ' --batch-mira-args '-J 300' --decoder-flags 
'-threads 8 -v 0' 1>&mert.out

but it dies at the following step:

Finished loading LM0
Loading UnknownWordPenalty0
Finished loading UnknownWordPenalty0
Loading TranslationModel0
Finished loading TranslationModel0
Loading TranslationModel1
Finished loading TranslationModel1
START LoadMappings()
END LoadMappings()
END LoadDecodeGraphBackoff()
Loaded : [0.0490292] seconds
RUN BATCH
terminate called after throwing an instance of 'util::Exception'
  what():  moses2/SCFG/nbest/KBestExtractor.cpp:42 in void Moses2::SCFG::KBestE$
Only suppose to be 1 hypo coll in last stack
Aborted (core dumped)
Exit code: 134
The decoder died. CONFIG WAS -weight-overwrite 'TranslationModel1= 0.285714 Phr$
How can I fix it?
Thank you for your attention.

 
    
_______________________________________________
Moses-support mailing list
Moses-support@mit.edu
http://mailman.mit.edu/mailman/listinfo/moses-support
