Hi,
I am using inc-giza-pp and xmlrpc-c. Both compiled successfully; xmlrpc-c is
installed in a local directory, and its library files are located in
/home/sdandapat/Moses/xmlrpc-c/lib64.

The Moses decoder also compiled fine with inc-giza-pp.

While building the translation model, it fails with the following error:

/home/sdandapat/Moses/mosesdecoder/tools/GIZA++: error while loading shared
libraries: libxmlrpc_server_abyss++.so.8: cannot open shared object file:
No such file or directory
Exit code: 127
ERROR: Giza did not produce the output file
/home/sdandapat/Moses_retrain/EnPl/train/giza.en-pl/en-pl.Ahmm.5. Is your
corpus clean (reasonably-sized sentences)? at
/home/sdandapat/Moses/mosesdecoder/scripts/training/train-model.perl line
1199.

I am attaching the entire log. What am I doing wrong?
I found a similar post at
http://blog.gmane.org/gmane.comp.nlp.moses.user/month=20111201 but could
not find a final solution there.
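For reference, this is the kind of check I would expect to diagnose the problem: the error at exit code 127 is the dynamic linker failing to find libxmlrpc_server_abyss++.so.8, which lives in the local lib64 directory above. A minimal sketch (assuming the install path from my setup; the `ldd` line is illustrative and only works on a machine where that GIZA++ binary exists):

```shell
# List any shared libraries the GIZA++ binary cannot resolve
# (uncomment on the machine where the binary is installed):
# ldd /home/sdandapat/Moses/mosesdecoder/tools/GIZA++ | grep "not found"

# Point the dynamic linker at the locally installed xmlrpc-c libraries
# before running train-model.perl:
export LD_LIBRARY_PATH=/home/sdandapat/Moses/xmlrpc-c/lib64:$LD_LIBRARY_PATH
echo "$LD_LIBRARY_PATH"
```

Presumably this export has to be in effect in the same shell (or job script) that launches the training, since train-model.perl invokes GIZA++ as a subprocess.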

Thanks and regards,
sandipan


--------------------------------
Sandipan Dandapat
Postdoctoral Researcher
CNGL, School of Computing
Dublin City University
Google Scholar Profile:
http://scholar.google.co.in/citations?user=DWD_FiQAAAAJ&hl=en
Tokenizer Version 1.1
Language: en
Number of threads: 1
Tokenizer Version 1.1
Language: pl
Number of threads: 1
clean-corpus.perl: processing mtdata/train.tok.en & .pl to mtdata/train.clean, cutoff 1-100
.....
Input sentences: 50000  Output sentences:  50000
mkdir: cannot create directory ‘lm’: File exists
Pruning
=== 1/5 Counting and sorting n-grams ===
Reading /home/sdandapat/Moses_retrain/EnPl/mtdata/train.clean.pl
----5---10---15---20---25---30---35---40---45---50---55---60---65---70---75---80---85---90---95--100
****************************************************************************************************
Unigram tokens 917274 types 51774
=== 2/5 Calculating and sorting adjusted counts ===
Chain sizes: 1:621288 2:2931873280 3:5497262592
Statistics:
1 51774 D1=0.607105 D2=1.0428 D3+=1.51615
2 379279 D1=0.801548 D2=1.15112 D3+=1.4064
3 655686 D1=0.884806 D2=1.19448 D3+=1.37341
Memory estimate for binary LM:
type       kB
probing 21729 assuming -p 1.5
probing 24154 assuming -r models -p 1.5
trie     9558 without quantization
trie     5545 assuming -q 8 -b 8 quantization 
trie     8990 assuming -a 22 array pointer compression
trie     4976 assuming -a 22 -q 8 -b 8 array pointer compression and quantization
=== 3/5 Calculating and sorting initial probabilities ===
Chain sizes: 1:621288 2:6068464 3:13113720
=== 4/5 Calculating and writing order-interpolated probabilities ===
Chain sizes: 1:621288 2:6068464 3:13113720
Chain sizes: 1:621288 2:6068464 3:13113720
=== 5/5 Writing ARPA model ===
Name:lmplz	VmPeak:8778804 kB	VmRSS:23180 kB	RSSMax:1931028 kB	user:2.96355	sys:1.23781	CPU:4.20136	real:4.89953
Reading lm/train.arpa.pl
----5---10---15---20---25---30---35---40---45---50---55---60---65---70---75---80---85---90---95--100
****************************************************************************************************
SUCCESS
Using SCRIPTS_ROOTDIR: /home/sdandapat/Moses/mosesdecoder/scripts
Using single-thread GIZA
(1) preparing corpus @ Wed Aug 13 12:44:56 IST 2014
Executing: mkdir -p /home/sdandapat/Moses_retrain/EnPl/train/corpus
(1.0) selecting factors @ Wed Aug 13 12:44:56 IST 2014
(1.1) running mkcls  @ Wed Aug 13 12:44:56 IST 2014
/home/sdandapat/Moses/mosesdecoder/tools/mkcls -c50 -n2 -p/home/sdandapat/Moses_retrain/EnPl/mtdata/train.clean.en -V/home/sdandapat/Moses_retrain/EnPl/train/corpus/en.vcb.classes opt
  /home/sdandapat/Moses_retrain/EnPl/train/corpus/en.vcb.classes already in place, reusing
(1.1) running mkcls  @ Wed Aug 13 12:44:56 IST 2014
/home/sdandapat/Moses/mosesdecoder/tools/mkcls -c50 -n2 -p/home/sdandapat/Moses_retrain/EnPl/mtdata/train.clean.pl -V/home/sdandapat/Moses_retrain/EnPl/train/corpus/pl.vcb.classes opt
  /home/sdandapat/Moses_retrain/EnPl/train/corpus/pl.vcb.classes already in place, reusing
(1.2) creating vcb file /home/sdandapat/Moses_retrain/EnPl/train/corpus/en.vcb @ Wed Aug 13 12:44:56 IST 2014
(1.2) creating vcb file /home/sdandapat/Moses_retrain/EnPl/train/corpus/pl.vcb @ Wed Aug 13 12:44:56 IST 2014
(1.3) numberizing corpus /home/sdandapat/Moses_retrain/EnPl/train/corpus/en-pl-int-train.snt @ Wed Aug 13 12:44:57 IST 2014
  /home/sdandapat/Moses_retrain/EnPl/train/corpus/en-pl-int-train.snt already in place, reusing
(1.3) numberizing corpus /home/sdandapat/Moses_retrain/EnPl/train/corpus/pl-en-int-train.snt @ Wed Aug 13 12:44:57 IST 2014
  /home/sdandapat/Moses_retrain/EnPl/train/corpus/pl-en-int-train.snt already in place, reusing
(2) running giza @ Wed Aug 13 12:44:57 IST 2014
(2.1a) running snt2cooc en-pl @ Wed Aug 13 12:44:57 IST 2014

Executing: mkdir -p /home/sdandapat/Moses_retrain/EnPl/train/giza.en-pl
Executing: /home/sdandapat/Moses/mosesdecoder/tools/snt2cooc.out /home/sdandapat/Moses_retrain/EnPl/train/corpus/pl.vcb /home/sdandapat/Moses_retrain/EnPl/train/corpus/en.vcb /home/sdandapat/Moses_retrain/EnPl/train/corpus/en-pl-int-train.snt > /home/sdandapat/Moses_retrain/EnPl/train/giza.en-pl/en-pl.cooc
line 1000
line 2000
line 3000
line 4000
line 5000
line 6000
line 7000
line 8000
line 9000
line 10000
line 11000
line 12000
line 13000
line 14000
line 15000
line 16000
line 17000
line 18000
line 19000
line 20000
line 21000
line 22000
line 23000
line 24000
line 25000
line 26000
line 27000
line 28000
line 29000
line 30000
line 31000
line 32000
line 33000
line 34000
line 35000
line 36000
line 37000
line 38000
line 39000
line 40000
line 41000
line 42000
line 43000
line 44000
line 45000
line 46000
line 47000
line 48000
line 49000
line 50000
END.
(2.1b) running giza en-pl @ Wed Aug 13 12:45:07 IST 2014
/home/sdandapat/Moses/mosesdecoder/tools/GIZA++  -CoocurrenceFile /home/sdandapat/Moses_retrain/EnPl/train/giza.en-pl/en-pl.cooc -c /home/sdandapat/Moses_retrain/EnPl/train/corpus/en-pl-int-train.snt -hmmdumpfrequency 5 -hmmiterations 5 -m1 5 -m2 0 -m3 0 -m4 0 -m5 0 -model1dumpfrequency 0 -model2dumpfrequency 0 -model345dumpfrequency 0 -model4smoothfactor 0.4 -nodumps 0 -nsmooth 4 -o /home/sdandapat/Moses_retrain/EnPl/train/giza.en-pl/en-pl -onlyaldumps 1 -p0 0.999 -s /home/sdandapat/Moses_retrain/EnPl/train/corpus/pl.vcb -t /home/sdandapat/Moses_retrain/EnPl/train/corpus/en.vcb
Executing: /home/sdandapat/Moses/mosesdecoder/tools/GIZA++  -CoocurrenceFile /home/sdandapat/Moses_retrain/EnPl/train/giza.en-pl/en-pl.cooc -c /home/sdandapat/Moses_retrain/EnPl/train/corpus/en-pl-int-train.snt -hmmdumpfrequency 5 -hmmiterations 5 -m1 5 -m2 0 -m3 0 -m4 0 -m5 0 -model1dumpfrequency 0 -model2dumpfrequency 0 -model345dumpfrequency 0 -model4smoothfactor 0.4 -nodumps 0 -nsmooth 4 -o /home/sdandapat/Moses_retrain/EnPl/train/giza.en-pl/en-pl -onlyaldumps 1 -p0 0.999 -s /home/sdandapat/Moses_retrain/EnPl/train/corpus/pl.vcb -t /home/sdandapat/Moses_retrain/EnPl/train/corpus/en.vcb
/home/sdandapat/Moses/mosesdecoder/tools/GIZA++: error while loading shared libraries: libxmlrpc_server_abyss++.so.8: cannot open shared object file: No such file or directory
Exit code: 127
ERROR: Giza did not produce the output file /home/sdandapat/Moses_retrain/EnPl/train/giza.en-pl/en-pl.Ahmm.5. Is your corpus clean (reasonably-sized sentences)? at /home/sdandapat/Moses/mosesdecoder/scripts/training/train-model.perl line 1199.
_______________________________________________
Moses-support mailing list
Moses-support@mit.edu
http://mailman.mit.edu/mailman/listinfo/moses-support