[ https://issues.apache.org/jira/browse/MAHOUT-1557?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14007235#comment-14007235 ]

Sebastian Schelter commented on MAHOUT-1557:
--------------------------------------------

Karol, your patch contains some errors; for example, the variable position is
set but never read in RunMultilayerPerceptron.

Furthermore, NeuralNetwork internally converts the input to a DenseVector in
getOutput(), so you also have to modify that code, otherwise sparse inputs are
densified before they reach the network.
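To illustrate the cost of that conversion: the sketch below is not Mahout's actual API (SparseVector and densify() here are simplified stand-ins), but it shows why copying a sparse input into a dense array defeats the purpose of sparse training vectors when the input dimension is large.

```java
import java.util.HashMap;
import java.util.Map;

public class SparseInputSketch {
    // Minimal sparse vector: stores only the non-zero entries.
    static class SparseVector {
        final int size;
        final Map<Integer, Double> entries = new HashMap<>();
        SparseVector(int size) { this.size = size; }
        void set(int i, double v) { if (v != 0.0) entries.put(i, v); }
        double get(int i) { return entries.getOrDefault(i, 0.0); }
        int nonZeroCount() { return entries.size(); }
    }

    // Densifying allocates storage for every dimension, analogous to what
    // happens when getOutput() copies the input into a DenseVector.
    static double[] densify(SparseVector v) {
        double[] dense = new double[v.size];
        for (Map.Entry<Integer, Double> e : v.entries.entrySet()) {
            dense[e.getKey()] = e.getValue();
        }
        return dense;
    }

    public static void main(String[] args) {
        SparseVector input = new SparseVector(1_000_000);
        input.set(3, 1.0);
        input.set(999_999, 2.5);
        // Only 2 non-zeros, yet densifying allocates a million doubles.
        System.out.println(input.nonZeroCount());
        System.out.println(densify(input).length);
    }
}
```

A sparse-aware getOutput() would instead iterate only the non-zero entries of the input when computing the first layer's activations.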

> Add support for sparse training vectors in MLP
> ----------------------------------------------
>
>                 Key: MAHOUT-1557
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1557
>             Project: Mahout
>          Issue Type: Improvement
>          Components: Classification
>            Reporter: Karol Grzegorczyk
>            Priority: Minor
>              Labels: mlp
>             Fix For: 1.0
>
>         Attachments: mlp_sparse.diff
>
>
> When the number of input units of an MLP is large, the input vectors are 
> likely to be sparse. It should be possible to read input files in a sparse 
> format.



--
This message was sent by Atlassian JIRA
(v6.2#6252)
