How does this compare to Mocha.jl?

On Monday, October 26, 2015 at 04:27:31 UTC+1, Chiyuan Zhang wrote:
>
> MXNet.jl <https://github.com/dmlc/MXNet.jl> is the dmlc/mxnet 
> <https://github.com/dmlc/mxnet> Julia <http://julialang.org/> package. 
> MXNet.jl brings flexible and efficient GPU computing and state-of-the-art 
> deep learning to Julia. Highlights include:
>
>    - Efficient tensor/matrix computation across multiple devices, 
>    including multiple CPUs, GPUs and distributed server nodes (see the 
>    sketch after this list).
>    - Flexible symbolic manipulation to compose and construct 
>    state-of-the-art deep learning models.
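>
> As a minimal sketch of the first point (assuming the NDArray constructors 
> mx.ones/mx.zeros and the mx.cpu()/mx.gpu() contexts described in the docs), 
> arrays live on a chosen device and arithmetic runs where the data lives:
>
> using MXNet
>
> a = mx.ones(2, 3)               # 2x3 array of ones on the default CPU context
> b = mx.zeros((2, 3), mx.cpu())  # shape tuple plus explicit context; mx.gpu() targets GPU 0
> c = a + b                       # elementwise add executes on the device holding the data
> println(copy(c))                # copy the NDArray back to an ordinary Julia Array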
>
> Here is an example of what training a simple 3-layer MLP on MNIST looks like:
>
> using MXNet
>
> mlp = @mx.chain mx.Variable(:data)             =>
>   mx.FullyConnected(name=:fc1, num_hidden=128) =>
>   mx.Activation(name=:relu1, act_type=:relu)   =>
>   mx.FullyConnected(name=:fc2, num_hidden=64)  =>
>   mx.Activation(name=:relu2, act_type=:relu)   =>
>   mx.FullyConnected(name=:fc3, num_hidden=10)  =>
>   mx.Softmax(name=:softmax)
> # data provider
> batch_size = 100
> include(joinpath(Pkg.dir("MXNet"), "examples", "mnist", "mnist-data.jl"))
> train_provider, eval_provider = get_mnist_providers(batch_size)
> # setup model
> model = mx.FeedForward(mlp, context=mx.cpu())
> # optimizer
> optimizer = mx.SGD(lr=0.1, momentum=0.9, weight_decay=0.00001)
> # fit parameters
> mx.fit(model, optimizer, train_provider, n_epoch=20, eval_data=eval_provider)
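>
> Once fitting completes, class probabilities for a data provider can be 
> obtained with mx.predict (a short follow-up sketch, assuming the API shown 
> in the package's MNIST tutorial):
>
> probs = mx.predict(model, eval_provider)  # one column of class probabilities per example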
>
> For more details, please refer to the documentation 
> <http://mxnetjl.readthedocs.org/> and examples 
> <https://github.com/dmlc/MXNet.jl/blob/master/examples>.
>
>
> Enjoy!
>
> - pluskid
>
