Hi Alexander,

Thanks for your response. Can you suggest ways to incorporate model
parallelism into MLPC? I am trying to do the same in Spark. I came across
your post
http://apache-spark-developers-list.1001551.n3.nabble.com/Model-parallelism-with-RDD-td13141.html
where you split the weight matrix across different worker machines. I
have two basic questions in this regard:

1. How can I visualize/analyze and control how the nodes of the neural
network and their weights are divided across the different workers?

2. Is there an alternative way to achieve model parallelism for MLPC in
Spark? I believe we need some form of synchronization and control over the
updates to the weights shared across different workers during
backpropagation. (I have added a rough sketch below to make this concrete.)
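
To make the two questions concrete, here is a rough Scala sketch of what I
have in mind. The block layout, sizes, and gradient are made-up placeholders
rather than the actual MLPC internals: a weight matrix split row-wise into
blocks, one block per partition, with a crude driver-side synchronization
step standing in for a parameter server.

import org.apache.spark.{SparkConf, SparkContext}
import breeze.linalg.DenseMatrix

object ModelParallelSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("model-parallel-sketch"))

    // Hypothetical sizes: 4 weight blocks of 250 x 100 each.
    val numBlocks = 4
    val rowsPerBlock = 250
    val cols = 100

    // (blockId, weightBlock) pairs, one block per partition, so each worker
    // holds only its part of the model.
    val weightBlocks = sc
      .parallelize(0 until numBlocks, numBlocks)
      .map(id => (id, DenseMatrix.zeros[Double](rowsPerBlock, cols)))

    // Question 1: inspect which block ended up on which partition.
    weightBlocks
      .mapPartitionsWithIndex { (pid, iter) =>
        iter.map { case (blockId, w) =>
          s"partition $pid holds block $blockId (${w.rows} x ${w.cols})"
        }
      }
      .collect()
      .foreach(println)

    // Question 2: one synchronization step. Each worker computes a partial
    // gradient for its block; the driver collects and rebroadcasts the
    // updated blocks (a crude stand-in for a parameter server).
    val learningRate = 0.01
    val updated = weightBlocks
      .map { case (blockId, w) =>
        val grad = DenseMatrix.ones[Double](w.rows, w.cols) // placeholder gradient
        (blockId, w - (grad * learningRate))
      }
      .collectAsMap()
    val syncedWeights = sc.broadcast(updated) // visible to all workers for the next pass

    sc.stop()
  }
}

Would something along these lines fit with how MLPC represents its weights,
or would a different layout be needed?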

Looking forward to your views on this.

Thanks and Regards,
Disha

On Wed, Dec 9, 2015 at 12:36 AM, Ulanov, Alexander <alexander.ula...@hpe.com> wrote:

> Hi Disha,
>
>
>
> Multilayer perceptron classifier in Spark implements data parallelism.
>
>
>
> Best regards, Alexander
>
>
>
> *From:* Disha Shrivastava [mailto:dishu....@gmail.com]
> *Sent:* Tuesday, December 08, 2015 12:43 AM
> *To:* dev@spark.apache.org; Ulanov, Alexander
> *Subject:* Data and Model Parallelism in MLPC
>
>
>
> Hi,
>
> I would like to know if the implementation of MLPC in the latest released
> version of Spark (1.5.2) implements model parallelism and data
> parallelism as done in the DistBelief model implemented by Google
> http://static.googleusercontent.com/media/research.google.com/hi//archive/large_deep_networks_nips2012.pdf
>
>
> Thanks And Regards,
>
> Disha
>
