Hi,

I added the training script and some documentation:
http://www.statmt.org/mosesdev/?n=Moses.AdvancedFeatures#ntoc25

Let me know if this actually works.

-phi

On Mon, Oct 25, 2010 at 1:15 PM, Ondrej Bojar <bo...@ufal.mff.cuni.cz> wrote:
> Hi, Philipp,
>
> I was wondering what that secret model was... Is there any brief
> documentation of what the Moses code expects to load for this model?
>
> The training of this discriminative word lexicon can be heavily
> parallelized. Is there any such implementation available, even if it
> is not particularly efficient?
>
> Cheers, O.
>
> Philipp Koehn wrote:
>> Hi,
>>
>> I am not familiar with that, but somewhat related is
>> Arne Mauser's global lexical model, which also exists
>> as a secret feature in Moses (secret because no
>> efficient training exists):
>>
>> Citation:
>> A. Mauser, S. Hasan, and H. Ney. Extending Statistical Machine
>> Translation with Discriminative and Trigger-Based Lexicon Models. In
>> Conference on Empirical Methods in Natural Language Processing
>> (EMNLP), Singapore, August 2009.
>> http://www-i6.informatik.rwth-aachen.de/publications/download/628/MauserArneHasanSav%7Bs%7DaNeyHermann--ExtendingStatisticalMachineTranslationwithDiscriminativeTrigger-BasedLexiconModels--2009.pdf
>>
>> -phi
>>
>>
>> On Fri, Oct 22, 2010 at 7:02 PM, Francis Tyers <fty...@prompsit.com> wrote:
>>> Hello all,
>>>
>>> I have a rather strange request. Does anyone know of any papers (or
>>> implementations) on bag-of-words language models? That is, a language
>>> model that does not take into account the order in which the words
>>> appear in an n-gram, so if you have the string 'police chief of' in your
>>> model, you will get a result for both 'chief of police' and 'police
>>> chief of'. I have thought of using IRSTLM or some generic model and
>>> scoring all the permutations, but wondered if there was a more efficient
>>> implementation already in existence. I have searched without much luck
>>> on Google, but perhaps I am searching with the wrong words.
>>>
>>> Best regards,
>>>
>>> Fran
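
The order-insensitive lookup Fran describes can be sketched directly without scoring every permutation: store each n-gram under a sorted key, so all permutations of the same words share one count. A minimal illustration in Python (the class name and add-one smoothing are illustrative choices, not an existing tool):

```python
from collections import Counter
from math import log

class BagOfWordsLM:
    """Toy order-insensitive n-gram model: every n-gram is keyed by its
    sorted word tuple, so 'police chief of' and 'chief of police' share
    one count. Illustrative sketch only, not an existing implementation."""

    def __init__(self, n=3):
        self.n = n
        self.counts = Counter()
        self.total = 0

    @staticmethod
    def _key(words):
        # Canonical, order-independent key for a bag of words.
        return tuple(sorted(words))

    def train(self, sentence):
        tokens = sentence.split()
        for i in range(len(tokens) - self.n + 1):
            self.counts[self._key(tokens[i:i + self.n])] += 1
            self.total += 1

    def logprob(self, ngram):
        # Add-one smoothing so unseen bags still get a finite score.
        c = self.counts[self._key(ngram.split())]
        return log((c + 1) / (self.total + len(self.counts) + 1))

lm = BagOfWordsLM(n=3)
lm.train("the police chief of the city resigned")
# All permutations of the same words map to the same key:
assert lm.logprob("chief of police") == lm.logprob("police chief of")
```

One lookup per query replaces scoring n! permutations; the trade-off is that the model can no longer distinguish word orders at all, which is exactly the behaviour asked for above.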
>>>
>>> _______________________________________________
>>> Moses-support mailing list
>>> Moses-support@mit.edu
>>> http://mailman.mit.edu/mailman/listinfo/moses-support
>>>
>
