[ https://issues.apache.org/jira/browse/HAMA-961?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14591495#comment-14591495 ]

ChiaHung Lin commented on HAMA-961:
-----------------------------------

If I understand correctly, the setup [1] resembles [2], except that there are 
no model replicas (a rough sketch of this push/pull pattern follows the 
references below). In addition, the parameter server [3] appears to already 
have its source released under the Apache License.

[1]. https://docs.google.com/drawings/d/1cjz50sGbpnFp2oab30cZ5MNYsaD3PtaBRVsUWuLiglI/edit?usp=sharing
[2]. http://research.google.com/archive/large_deep_networks_nips2012.html
[3]. https://github.com/dmlc/parameter_server
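
To make the comparison concrete, here is a minimal sketch of that pattern, 
assuming the [2]-style flow without model replicas: data-parallel workers 
each hold a full copy of the weights, push gradients to a shared parameter 
server, and pull the updated weights back. The ParameterServer and Worker 
classes below are hypothetical illustrations, not existing Hama or 
dmlc/parameter_server APIs.

{code:java}
import java.util.Arrays;

// Hypothetical sketch only: none of these classes exist in Hama or in
// dmlc/parameter_server; they illustrate the flow described above.
public class ParameterServerSketch {

  // Server side: owns the authoritative weights and applies updates.
  static class ParameterServer {
    private final double[] weights;
    private final double learningRate;

    ParameterServer(int dim, double learningRate) {
      this.weights = new double[dim];
      this.learningRate = learningRate;
    }

    // push(): fold a worker's gradient into the weights (asynchronous SGD).
    synchronized void push(double[] gradient) {
      for (int i = 0; i < weights.length; i++) {
        weights[i] -= learningRate * gradient[i];
      }
    }

    // pull(): hand a copy of the current weights back to a worker.
    synchronized double[] pull() {
      return Arrays.copyOf(weights, weights.length);
    }
  }

  // Worker side: one data shard per worker, one full weight copy, no replicas.
  static class Worker implements Runnable {
    private final ParameterServer server;
    private final double[][] shard; // this worker's slice of the training data

    Worker(ParameterServer server, double[][] shard) {
      this.server = server;
      this.shard = shard;
    }

    @Override
    public void run() {
      for (double[] example : shard) {
        double[] w = server.pull();             // fetch the latest weights
        server.push(dummyGradient(w, example)); // send the gradient back
      }
    }

    // Stand-in for the real MLP backpropagation step: gradient of the
    // toy loss 0.5 * ||w - x||^2, which pulls w toward the example x.
    private double[] dummyGradient(double[] w, double[] x) {
      double[] g = new double[w.length];
      for (int i = 0; i < g.length; i++) {
        g[i] = w[i] - x[i];
      }
      return g;
    }
  }

  public static void main(String[] args) throws InterruptedException {
    ParameterServer server = new ParameterServer(3, 0.1);
    double[][][] shards = { { { 1, 2, 3 }, { 1, 2, 3 } },
                            { { 1, 2, 3 }, { 1, 2, 3 } } };
    Thread[] workers = new Thread[shards.length];
    for (int i = 0; i < workers.length; i++) {
      workers[i] = new Thread(new Worker(server, shards[i]));
      workers[i].start();
    }
    for (Thread t : workers) {
      t.join();
    }
    // The weights drift toward (1, 2, 3), the minimizer of the toy loss.
    System.out.println(Arrays.toString(server.pull()));
  }
}
{code}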

> Parameter Server for large scale MLP
> ------------------------------------
>
>                 Key: HAMA-961
>                 URL: https://issues.apache.org/jira/browse/HAMA-961
>             Project: Hama
>          Issue Type: Improvement
>          Components: machine learning
>    Affects Versions: 0.7.0
>            Reporter: Edward J. Yoon
>            Assignee: Edward J. Yoon
>             Fix For: 0.8.0
>
>
> I've recently started reviewing the MLP source code closely, and I'm 
> thinking about some improvements and API refactoring, e.g., APIs for 
> user-defined neuron and synapse models, data structures, etc.
> This issue is one of them, and is related to training large models. I'm 
> considering a distributed parameter server (http://parameterserver.org) 
> for managing the parameters.
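
The user-defined neuron API mentioned in the quoted description could take 
roughly the following shape; the Neuron interface and its method names are 
purely illustrative, not Hama's actual or planned API.

{code:java}
// Hypothetical API shape only; not Hama's actual or planned interface.
interface Neuron {
  // Combine the weighted inputs from incoming synapses into an activation.
  double forward(double[] inputs, double[] weights);

  // Turn the error propagated from downstream neurons into this neuron's
  // delta for backpropagation, given its stored activation.
  double backward(double downstreamError, double activation);
}

// Example user implementation: a plain sigmoid neuron.
class SigmoidNeuron implements Neuron {
  @Override
  public double forward(double[] inputs, double[] weights) {
    double sum = 0.0;
    for (int i = 0; i < inputs.length; i++) {
      sum += inputs[i] * weights[i];
    }
    return 1.0 / (1.0 + Math.exp(-sum)); // sigmoid activation
  }

  @Override
  public double backward(double downstreamError, double activation) {
    return downstreamError * activation * (1.0 - activation); // sigmoid'
  }
}
{code}

The trainer would then iterate over the layers calling forward() and 
backward(), with the resulting weight updates pushed to a parameter server 
along the lines sketched earlier.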


