Hi Viktor,
As far as I know, you can use the wrapper scripts (treetagger or mxpost)
located at /mosesdecoder/scripts/training/wrappers for this purpose.
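As a minimal sketch of how such a wrapper is typically used — note that the script name below is illustrative, not confirmed; list the directory to find the exact wrapper names and arguments your Moses checkout ships with:

```shell
# See which tagger/lemmatizer wrappers your checkout provides
ls /mosesdecoder/scripts/training/wrappers/

# Hypothetical example: run an English corpus through a TreeTagger wrapper.
# The script name and calling convention vary by version -- verify locally.
/mosesdecoder/scripts/training/wrappers/make-pos.tree-tagger.perl \
    < corpus.en > corpus.pos.en
```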
Regards
Massinissa
2014-03-20 17:37 GMT+01:00 Viktor Pless:
> Hi, what tools can be used to lemmatize/POS-tag/etc. a corpus in moses
> format
It looks like Boost hasn't been installed properly.
What is the exact command you used to compile Moses? How did you install
boost? What OS are you running on?
On 20 March 2014 17:40, Martin McCaffery wrote:
> Hi all,
>
> I've been trying to get Moses up and running locally and been running in
where did you download your source code from, and when did you download it?
There is one error I can see, and I think it was fixed a few weeks ago. If you
download the latest version of Moses from GitHub, it should be OK:
git clone https://github.com/moses-smt/mosesdecoder.git
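Putting the steps together, a typical fresh build looks something like this (assuming Boost is already installed; -j8 just parallelizes compilation across 8 cores, as in the command quoted below):

```shell
# Fetch the latest Moses source and compile it
git clone https://github.com/moses-smt/mosesdecoder.git
cd mosesdecoder
./bjam -j8
```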
On 20 March 2014 14:06, T
Hi all,
I've been trying to get Moses up and running locally and been running into
problems. At first I had issues with trying to get Boost working on my
local machine, but after tweaking Boost the errors have become fewer.
Now when running ./bjam I seem to simply be getting g++ compile and testin
Hi Ken,
Yes, different models, different languages. Thanks! Yes, lazy loading is
absolutely dead slow.
Lexi
On Thu, Mar 20, 2014 at 4:53 PM, Kenneth Heafield wrote:
> Hi Lexi,
>
> I take it that these models are different, not the same model
> loaded
> into each process (in which case t
Hi Lexi,
I take it that these models are different, not the same model loaded
into each process (in which case they would have shared). I'd really
recommend trying to compress things more (e.g. trie -a 64 -q 8) before
going to lazy loading.
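Concretely, that suggestion corresponds to KenLM's build_binary options: -a compresses trie pointer arrays and -q quantizes probabilities to fewer bits. A sketch, with model.arpa and model.binlm as placeholder filenames:

```shell
# Build a compressed trie-format binary LM:
#   -a 64 : trie pointer-array compression (max bit width)
#   -q 8  : quantize probabilities to 8 bits
build_binary -a 64 -q 8 trie model.arpa model.binlm
```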
Kenneth
On 03/20/14 08:13, Marcin Junczys-Dowm
Hi, what tools can be used to lemmatize/POS-tag/etc. a corpus in moses
format (with the pipes)? I need them regarding Spanish, English, Hungarian.
Thanks in advance.
Viktor
_______________________________________________
Moses-support mailing list
Moses-support@mit.edu
http://mailman.mit.edu/mailman/listinfo/moses-support
Hi,
since KenLM uses shared memory, four instances should take up the same
amount of memory as only one instance (yesterday I ran 8 instances with 8
threads each with a 99 GB LM on a 128 GB machine). If the model fits into
memory for a single instance, it should work, provided you have enough memory
left
Hi Vishal
There's no other code in Moses for web page translation. It just takes a
bit more work to integrate it with recent versions of Moses.
cheers - Barry
On 19/03/14 17:04, Vishal Goyal(विशाल गोयल) wrote:
> Thanks for the kind reply.
> Then kindly suggest how we can make the web translatio
I have found the answer on the KenLM web page and it seems to be working:
Full or lazy loading
KenLM supports lazy loading via mmap. This allows you to further reduce
memory usage, especially with trie which has good memory locality. In
Moses, this is controlled by the language model number in mo
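As a hedged sketch of where that setting lives: in Moses versions using the [feature] section syntax, the KENLM feature line takes a lazyken flag (1 = lazy mmap loading, 0 = load fully into memory). The path and order below are placeholders; check your own moses.ini for the exact line:

```ini
[feature]
KENLM name=LM0 factor=0 order=5 path=/path/to/model.binlm lazyken=1
```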
Hi there,
I want to run 4 MT servers at the same time on a machine with limited
memory. KenLM seems to reserve the amount of memory that the language
model would have taken if it had been loaded into memory. So I don't have
enough memory to run all these servers and the machine grinds to a halt i
The command I type is: ./bjam -j8
many thanks for the support
-tiansi dong
build.log.gz
Description: GNU Zip compressed data
maybe try the code in
contrib/server/Translation-web
On 19 March 2014 17:04, Vishal Goyal(विशाल गोयल) wrote:
> Thanks for the kind reply.
> Then kindly suggest how we can make web translation a reality? Which new
> we need to use...
>
>
>
> On Wed, Mar 19, 2014 at 2:07 PM, Barry Haddow
> wrot
Hi all
I am trying to develop a specific segmenter. The goal is to send the Moses
decoder sentences instead of large, syntactically incoherent textual flows,
in order to integrate this segmenter into an automatic document-translation
workflow. I would define as a sentence delimiter any character that
Dear all,
I've managed to train a hierarchical model using the following command:
nohup /mosesdecoder/scripts/training/train-model.perl --hierarchical
--glue-grammar --score-options="--GoodTuring" -root-dir
/disque2/Preparation/syntactic/hierarchical_PSCT -corpus
/disque2/Preparation/backoff/PS
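For context, a complete train-model.perl invocation for a hierarchical model usually also names the source/target language suffixes, the language model, and the word aligner. The sketch below is not the original command: only --hierarchical, --glue-grammar, and --score-options are taken from it, and every path and language code is a placeholder:

```shell
# Hedged sketch of a full hierarchical training run (all paths are placeholders)
/mosesdecoder/scripts/training/train-model.perl \
    --hierarchical --glue-grammar --score-options="--GoodTuring" \
    -root-dir /path/to/working-dir \
    -corpus /path/to/corpus -f fr -e en \
    -alignment grow-diag-final-and \
    -lm 0:5:/path/to/lm.binlm:8 \
    -external-bin-dir /path/to/giza-bin
```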
** Apologies for cross-posting **
COLING 2014 Call for System Demonstrations
The 25th International Conference on Computational Linguistics, August 23 - 29,
2014, Dublin, Ireland
http://www.coling-2014.org
Important dates
19 May 2014: Paper submission deadline
23 June 2014: Auth