I do not believe the perceptron trainer is multithreaded. But it should be fast.
On 1/3/17, 1:44 PM, "Damiano Porta" <[email protected]> wrote:

Hi William, thank you! Is there a similar thing for perceptron (perceptron sequence) too?

2017-01-03 19:41 GMT+01:00 William Colen <[email protected]>:

> Damiano,
>
> If you are using Maxent, try TrainingParameters.THREADS_PARAM
>
> https://opennlp.apache.org/documentation/1.7.0/apidocs/opennlp-tools/opennlp/tools/util/TrainingParameters.html#THREADS_PARAM
>
> William
>
> 2017-01-03 16:27 GMT-02:00 Damiano Porta <[email protected]>:
>
> > I am training a new postagger and lemmatizer.
> >
> > 2017-01-03 19:24 GMT+01:00 Russ, Daniel (NIH/CIT) [E] <[email protected]>:
> >
> > > Can you be a little more specific? What trainer are you using?
> > > Thanks
> > > Daniel
> > >
> > > On 1/3/17, 1:22 PM, "Damiano Porta" <[email protected]> wrote:
> > >
> > > Hello,
> > > I have a very very big training set, is there a way to speed up the
> > > training process? I only have changed the Xmx option inside bin/opennlp
> > >
> > > Thanks
> > > Damiano
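
For reference, a minimal sketch of what William suggests, assuming you train the POS tagger through the Java API rather than bin/opennlp (the file names, thread count, and class name below are made up for illustration):

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

import opennlp.tools.postag.POSModel;
import opennlp.tools.postag.POSSample;
import opennlp.tools.postag.POSTaggerFactory;
import opennlp.tools.postag.POSTaggerME;
import opennlp.tools.postag.WordTagSampleStream;
import opennlp.tools.util.InputStreamFactory;
import opennlp.tools.util.MarkableFileInputStreamFactory;
import opennlp.tools.util.ObjectStream;
import opennlp.tools.util.PlainTextByLineStream;
import opennlp.tools.util.TrainingParameters;

public class MaxentThreadsExample {

    public static void main(String[] args) throws IOException {
        // Hypothetical training file, one sentence per line in word_tag format.
        InputStreamFactory in = new MarkableFileInputStreamFactory(new File("en-pos.train"));
        ObjectStream<POSSample> samples =
                new WordTagSampleStream(new PlainTextByLineStream(in, StandardCharsets.UTF_8));

        // Defaults are maxent (GIS), 100 iterations, cutoff 5.
        TrainingParameters params = TrainingParameters.defaultParams();
        // THREADS_PARAM is honored by the maxent trainer; the perceptron trainer ignores it.
        params.put(TrainingParameters.THREADS_PARAM, "4");

        POSModel model = POSTaggerME.train("en", samples, params, new POSTaggerFactory());
        samples.close();

        try (OutputStream out = new FileOutputStream("en-pos-maxent.bin")) {
            model.serialize(out);
        }
    }
}

If you prefer to stay with bin/opennlp, the same keys can go into a training parameters file passed with -params (one Key=Value per line, e.g. Threads=4); again, only the maxent trainer picks up the Threads setting.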
