Subject: Re: Data and Model Parallelism in MLPC

Hi Disha,

Another way of model parallelism would be to represent the network as a
graph and use GraphX to write forward and back propagation. However, this
option does not seem very practical to me.

Best regards, Alexander

From: Disha Shrivastava [mailto:dishu@gmail.com]
Sent: Tuesday, December 08, 2015 11:19 AM
To: Ulanov, Alexander
Cc: dev@spark.apache.org
Subject: Re: Data and Model Parallelism in MLPC

Hi Alexander,

Thanks for your response. Can you suggest ways to incorporate model
parallelism in MLPC? I am trying to do the same in Spark. I got hold of
your post
http://apache-spark-developers-list.1001551.n3.nabble.com/Model-parallelism-with-RDD-td13141.html
where you have divided the weight matrix.

Hi Disha,

Multilayer perceptron classifier in Spark implements data parallelism.

Best regards, Alexander

From: Disha Shrivastava [mailto:dishu@gmail.com]
Sent: Tuesday, December 08, 2015 12:43 AM
To: dev@spark.apache.org; Ulanov, Alexander
Subject: Data and Model Parallelism in MLPC

Hi,

I went through the code for the implementation of MLPC and couldn't
understand why stacking/unstacking of the input data is done. The
description says "Block size for stacking input data in matrices to speed
up the computation."
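As I understand the blockSize parameter, the stacking groups individual input vectors into matrices so that each block can be pushed through a layer with a single matrix-matrix multiply (a BLAS level-3 call) instead of one matrix-vector multiply per example, which is much friendlier to native BLAS. A minimal pure-Python sketch of the grouping step only; the function name and shapes are mine, not Spark's:

```python
def stack(rows, block_size=128):
    """Group individual input rows into blocks of up to block_size rows.

    Spark's MLPC stacks within each partition; the last block is simply
    smaller when fewer than block_size rows remain.
    """
    return [rows[i:i + block_size] for i in range(0, len(rows), block_size)]

# Each block can then be treated as a matrix: one matrix-matrix multiply
# with the layer weights replaces block_size matrix-vector multiplies.
blocks = stack(list(range(10)), block_size=4)
print(blocks)  # [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

Unstacking is then just the inverse: splitting the block of outputs back into per-example rows.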
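To illustrate what data parallelism means here: each partition computes a gradient over its own rows, and the per-partition gradients are aggregated before the weight update (as far as I know, Spark does this aggregation across partitions inside the optimizer). A toy single-process sketch with partitions simulated by plain lists; the one-parameter squared-error model and all names are mine, not Spark's:

```python
def local_gradient(w, partition):
    """Gradient of sum((w*x - y)^2) over one partition's (x, y) points."""
    return sum(2.0 * (w * x - y) * x for x, y in partition)

def distributed_gradient(w, partitions):
    # Each "worker" computes a gradient on its slice of the data;
    # the driver sums the partial gradients.
    return sum(local_gradient(w, p) for p in partitions)

data = [(1.0, 2.0), (2.0, 3.0), (3.0, 5.0), (4.0, 9.0)]
partitions = [data[:2], data[2:]]

# Summing per-partition gradients matches the gradient on the full data,
# which is why the model can stay replicated while the data is split.
print(local_gradient(1.0, data))          # -58.0
print(distributed_gradient(1.0, partitions))  # -58.0
```

The key point is that only the data is partitioned; every worker holds a full copy of the weights, which is the opposite of the model-parallel setup discussed below.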
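One way to picture the model parallelism being discussed (dividing the weight matrix between nodes, as in the linked post): give each worker a row slice of a layer's weight matrix, broadcast the input activations to all workers, and concatenate the partial outputs. A toy single-process illustration; everything here is a hypothetical sketch, not Spark code:

```python
def matvec(rows, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(a * b for a, b in zip(row, v)) for row in rows]

def split_rows(matrix, n_workers):
    """Row-wise slices of the weight matrix, one slice per worker."""
    k = (len(matrix) + n_workers - 1) // n_workers
    return [matrix[i:i + k] for i in range(0, len(matrix), k)]

W = [[1, 0, 2],
     [0, 1, 0],
     [3, 0, 1],
     [1, 1, 1]]
x = [1, 2, 3]

# Each "worker" computes the output entries for its rows of W; the
# concatenation equals the single-machine product W @ x.
shards = split_rows(W, 2)
parallel = [y for shard in shards for y in matvec(shard, x)]
print(parallel)  # [7, 2, 6, 6]
```

The cost of this scheme is the extra communication per layer (broadcasting activations, gathering slices), which is presumably why it only pays off when the weight matrices are too large for one node.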