Hi,

Typically, an MLP is trained with the backpropagation algorithm run for many epochs. A single epoch of backprop is usually not enough to build a reasonable model. Unfortunately, I do not see the number of epochs as an input parameter to the MLP training job in Mahout. Is this a deliberate decision, or is it just due to the immaturity of the implementation?
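To illustrate why a single epoch is rarely enough, here is a minimal numpy sketch (not Mahout code; the network shape, learning rate, and epoch count are arbitrary choices for the demo) that trains a tiny 2-4-1 MLP on XOR with full-batch backprop and compares the loss after one epoch versus many:

```python
import numpy as np

# Toy sketch, not Mahout: a 2-4-1 sigmoid MLP trained on XOR
# with full-batch gradient descent, to show that the loss keeps
# improving over many epochs rather than after just one.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def run_epoch(lr=0.5):
    global W1, b1, W2, b2
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    loss = np.mean((out - y) ** 2)
    # backward pass (mean squared error, sigmoid activations)
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)
    return loss

loss_after_1 = run_epoch()      # loss after a single epoch
for _ in range(5000):           # many more epochs of backprop
    loss_final = run_epoch()

print(f"after 1 epoch: {loss_after_1:.4f}, after 5001: {loss_final:.4f}")
```

A single pass barely moves the randomly initialized weights, while thousands of passes drive the loss down substantially, which is why an epoch/iteration count is normally an exposed training parameter.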

Regards,
Karol
