Hi all,

I'm happy to mentor projects related to Mocha.jl 
(https://github.com/pluskid/Mocha.jl), a deep learning library for Julia. 
There are several TODOs on my list that I have had difficulty finding free 
time to work on. You are also free to propose anything else related to the 
library:

1. Visualization of the networks (e.g. produce a dot file that could be 
rendered by GraphViz to visualize the network nicely)
2. Provide an easy interface for small-scale experiments (e.g. define a 
model by giving something like [(512,:relu), (512,:relu), 10], and be 
able to train or predict with one function call, without worrying about 
the details of layer definitions, the solver, coffee breaks, etc.)
3. Implement Recurrent Neural Networks (RNNs) and LSTMs
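
To give a feel for item 1, here is a minimal sketch of emitting a GraphViz 
dot file from a list of layers and edges. None of these names come from the 
current Mocha.jl API; the function and the (name, type) tuple layout are 
purely illustrative:

    # Hypothetical sketch: write a dot description of a network.
    function to_dot(io::IO, layers, edges)
        println(io, "digraph Net {")
        println(io, "  rankdir=BT;")   # draw bottom-up, data layer at the bottom
        for (name, typ) in layers
            println(io, "  \"$name\" [shape=box, label=\"$name\\n$typ\"];")
        end
        for (from, to) in edges
            println(io, "  \"$from\" -> \"$to\";")
        end
        println(io, "}")
    end

    layers = [("data", "HDF5Data"), ("ip1", "InnerProduct"), ("loss", "SoftmaxLoss")]
    edges  = [("data", "ip1"), ("ip1", "loss")]
    to_dot(stdout, layers, edges)   # pipe the output through `dot -Tpng` to render

The real task would be generating the layer list and edges by walking a 
constructed Net, but the dot-emission part would look much like this.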

Best,
Chiyuan

On Thursday, May 28, 2015 at 9:24:37 AM UTC-4, Viral Shah wrote:
>
> You should certainly write to pluskid - Mocha's author.
>
> On Thu, May 28, 2015 at 9:35 AM, Siva Prasad Varma <sivap...@gmail.com> 
> wrote:
>
>> I am interested in implementing neural network visualization for Mocha 
>> along the lines of https://github.com/ajtulloch/dnngraph, or in 
>> implementing some of the algorithms on the IterativeSolvers.jl roadmap, 
>> depending on whether I am able to find a mentor.
>>
>> Thanks,
>> Siva.
>>
>>
>> On Thursday, May 28, 2015 at 9:13:41 AM UTC+5:30, Jiahao Chen wrote:
>>>
>>> I'd be happy to mentor someone working on parallel linear algebra. The 
>>> simplest project with very high impact would be implementing a 
>>> high-performance iterative (Golub-Kahan-Lanczos) SVD, similar to what is 
>>> implemented in PROPACK. I'm also interested in a randomized SVD along 
>>> the lines of Halko, Martinsson and Tropp, doi:10.1137/090771806.
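>>>
>>> For context, the core of that method is the Golub-Kahan-Lanczos 
>>> bidiagonalization recurrence. A bare sketch in Julia (no 
>>> reorthogonalization, which a serious implementation like PROPACK's 
>>> needs; all names here are illustrative):
>>>
>>>     using LinearAlgebra
>>>
>>>     # Build U, V with (approximately) orthonormal columns and a lower
>>>     # bidiagonal B such that A*V ≈ U*B; the singular values of B
>>>     # approximate the largest singular values of A.
>>>     function gkl_bidiagonalize(A, k)
>>>         m, n = size(A)
>>>         U = zeros(m, k); V = zeros(n, k)
>>>         alpha = zeros(k); beta = zeros(k - 1)
>>>         u = randn(m); u /= norm(u)     # random unit starting vector
>>>         vprev = zeros(n)
>>>         for j in 1:k
>>>             U[:, j] = u
>>>             v = A' * u
>>>             j > 1 && (v -= beta[j-1] * vprev)
>>>             alpha[j] = norm(v); v /= alpha[j]
>>>             V[:, j] = v
>>>             if j < k
>>>                 u = A * v - alpha[j] * u
>>>                 beta[j] = norm(u); u /= beta[j]
>>>             end
>>>             vprev = v
>>>         end
>>>         return U, Bidiagonal(alpha, beta, :L), V
>>>     end
>>>
>>> The SVD of the small bidiagonal B then yields approximate singular 
>>> triplets of A; the hard parts of the project are reorthogonalization, 
>>> restarting, and doing the matrix-vector products in parallel.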
>>>
>>> I'm sure there are plenty of ODE projects around, but I would like to 
>>> see someone take up the implementation of geometric integrators in ODE.jl.
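>>>
>>> To illustrate what a geometric integrator buys you (this is generic, 
>>> not tied to ODE.jl's API): symplectic Euler for a separable Hamiltonian 
>>> H(q,p) = p^2/2 + V(q) keeps the energy error bounded over long 
>>> integrations, where explicit Euler drifts.
>>>
>>>     # Semi-implicit (symplectic) Euler: kick then drift.
>>>     function symplectic_euler(dVdq, q0, p0, dt, nsteps)
>>>         q, p = q0, p0
>>>         for _ in 1:nsteps
>>>             p -= dt * dVdq(q)   # update momentum with current position
>>>             q += dt * p         # update position with new momentum
>>>         end
>>>         return q, p
>>>     end
>>>
>>>     # Harmonic oscillator, V(q) = q^2/2: exact phase-space orbits are circles.
>>>     q, p = symplectic_euler(q -> q, 1.0, 0.0, 0.01, 10_000)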
>>>
>>> Thanks,
>>>
>>> Jiahao Chen
>>> Research Scientist
>>> MIT CSAIL
>>>  
>>
>
>
> -- 
> -viral
>  