Hi Julio,

Can you please cite some references on the distributed implementation?
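For concreteness, by "distributed implementation" I mean the usual data-parallel pattern: each worker computes gradients on its own data partition and the driver aggregates them (in Spark this is typically done with `RDD.treeAggregate`). Here is a toy sketch in plain Python — the single-weight linear model is only an illustrative stand-in for an RNN cell, and all names are made up for the example:

```python
# Illustrative sketch of data-parallel gradient averaging -- the pattern
# Spark's treeAggregate is commonly used for. Plain Python lists stand in
# for a cluster; the "model" is one weight, not a real RNN.

def local_gradient(w, partition):
    # Gradient of squared error for y = w * x, summed over one partition.
    g = 0.0
    n = 0
    for x, y in partition:
        g += 2.0 * (w * x - y) * x
        n += 1
    return g, n

def distributed_step(w, partitions, lr=0.01):
    # Each "worker" computes a local gradient; the "driver" averages them
    # and applies one update (in Spark: rdd.treeAggregate(...)).
    total_g, total_n = 0.0, 0
    for part in partitions:
        g, n = local_gradient(w, part)
        total_g += g
        total_n += n
    return w - lr * total_g / total_n

# Toy data generated by y = 3x, split across two "workers".
partitions = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0)]]
w = 0.0
for _ in range(200):
    w = distributed_step(w, partitions, lr=0.02)
print(round(w, 2))  # converges toward 3.0
```

The hard part for RNNs, as discussed below, is that backpropagation through time makes the gradient computation itself stateful and sequential, so it is not obvious that simple gradient averaging is enough — that is exactly where references would help.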

On Tue, Nov 3, 2015 at 8:52 PM, Julio Antonio Soto de Vicente <
ju...@esbet.es> wrote:

> Hi,
> It is my understanding that little research has been done so far on
> distributed computation (without access to shared memory) for RNNs. I also
> look forward to contributing in this respect.
>
> On 03/11/2015, at 16:00, Disha Shrivastava <dishu....@gmail.com>
> wrote:
>
> I would love to work on this, and I would appreciate ideas on how it can
> be done or suggestions of papers as a starting point. Also, I wanted to
> know whether Spark would be an ideal platform for a distributed
> implementation of RNN/LSTM.
>
> On Mon, Nov 2, 2015 at 10:52 AM, Sasaki Kai <lewua...@me.com> wrote:
>
>> Hi, Disha
>>
>> There seems to be no JIRA on RNN/LSTM directly, but there have been
>> several tickets about other types of networks related to deep learning.
>>
>> Stacked Auto Encoder
>> https://issues.apache.org/jira/browse/SPARK-2623
>> CNN
>> https://issues.apache.org/jira/browse/SPARK-9129
>> https://issues.apache.org/jira/browse/SPARK-9273
>>
>> Roadmap of MLlib deep learning
>> https://issues.apache.org/jira/browse/SPARK-5575
>>
>> I think it may be good to join the discussion on SPARK-5575.
>> Best
>>
>> Kai Sasaki
>>
>>
>> On Nov 2, 2015, at 1:59 PM, Disha Shrivastava <dishu....@gmail.com>
>> wrote:
>>
>> Hi,
>>
>> I wanted to know if someone is working on implementing RNN/LSTM in Spark
>> or has already done so. I am willing to contribute and would appreciate
>> some guidance on how to go about it.
>>
>> Thanks and Regards
>> Disha
>> Masters Student, IIT Delhi
>>
>>
>>
>
