Dear Moses team,

I am trying to translate two short sentences, contained in the same input file, 
from German into English using a “Phrase-based Model”. The first sentence (das 
auto wurde verkauft) is translated correctly, while the second is only partly 
translated.

For “ich habe das auto verkauft” I receive the following result:
Ich|UNK|UNK|UNK habe|UNK|UNK|UNK the car sold  [11111]   [total=-203.330]   
core=(-200.000, -5.000, 5.000, 0.000, 0.000, 0.000, 0.000, 0.000, -18.660)

I tried modifying the training data in different ways, and finally included the 
exact sentence (along with its translation) in the training data (see 
attachment). However, I still get the same result.

Do I need to use a “Factored Translation Model” instead of the “Phrase-based 
Model” to be able to translate this sentence? If so, I have found an explanation 
of how to train Factored Models at 
http://www.statmt.org/moses/?n=Moses.FactoredTutorial. Could you please tell me 
where I can find information about
1.      how to prepare the training data with additional factors before 
training the Factored Model? (I have sketched below what I imagine this 
involves.)
2.      how to train a Language Model that takes the POS factor into account?
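
As far as I understand, each token in a factored Moses corpus carries its 
factors separated by “|” (e.g. “auto|NN”). Below is a rough sketch of how I 
imagine adding a POS factor to the English side, assuming NLTK's English tagger 
purely for illustration (the German side would need a German-capable tagger, 
and the file names are just placeholders):

    import nltk  # may first require nltk.download("averaged_perceptron_tagger")

    def add_pos_factor(line):
        tokens = line.split()
        tagged = nltk.pos_tag(tokens)  # [(word, POS), ...]
        # Moses factored format: the factors of one token are joined by "|"
        return " ".join("{}|{}".format(word, pos) for word, pos in tagged)

    with open("corpus.en") as src, open("corpus.factored.en", "w") as out:
        for line in src:
            out.write(add_pos_factor(line.rstrip("\n")) + "\n")

Is that roughly the right idea?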

I currently use KenLM and GIZA++.
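
For question 2, my guess is that the POS Language Model is simply a KenLM model 
trained on the POS factor sequences, i.e. I would extract the POS column from 
the factored corpus and pass the resulting file to lmplz. A rough sketch of the 
extraction step (file names again just placeholders):

    # Keep only the POS factor (the field after "|") of every token,
    # so that KenLM's lmplz can be trained on the resulting POS-only corpus.
    with open("corpus.factored.en") as src, open("corpus.pos.en", "w") as out:
        for line in src:
            pos_tags = [token.split("|")[1] for token in line.split()]
            out.write(" ".join(pos_tags) + "\n")

Is that the intended approach, or does Moses provide its own tooling for this?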

Thanks a lot for your support.

Kind regards,
Shaimaa

Attachment: Training data.docx
Description: MS-Word 2007 document

_______________________________________________
Moses-support mailing list
Moses-support@mit.edu
http://mailman.mit.edu/mailman/listinfo/moses-support
