Re: [Launch Announcement] Keras 2 with Apache MXNet (incubating) backend

2018-05-22 Thread Indhu
This is great. Thanks to everybody involved.


On Tue, May 22, 2018, 9:15 AM sandeep krishnamurthy wrote:



[Launch Announcement] Keras 2 with Apache MXNet (incubating) backend

2018-05-22 Thread sandeep krishnamurthy
Hello MXNet community,

Keras users can now use the high-performance MXNet deep learning engine for
distributed training of convolutional neural networks (CNNs) and recurrent
neural networks (RNNs). By changing just a few lines of code, Keras
developers can speed up training with MXNet's multi-GPU distributed
training capabilities. Saving a trained model in native MXNet format is
another valuable feature of this release: you can design in Keras, train
with Keras-MXNet, and run inference in production, at scale, with MXNet.
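As a rough sketch of what the "few lines of code" change looks like (the
model, the 4-GPU count, and the export prefix below are illustrative
assumptions; see the project README for the exact, current steps):

```python
# Sketch only: assumes keras-mxnet is installed (pip install keras-mxnet)
# and that ~/.keras/keras.json sets "backend": "mxnet".
import keras
from keras.models import Sequential
from keras.layers import Conv2D, Flatten, Dense
from keras.utils import multi_gpu_model

# A small illustrative CNN; any existing Keras model works the same way.
model = Sequential([
    Conv2D(32, (3, 3), activation='relu', input_shape=(32, 32, 3)),
    Flatten(),
    Dense(10, activation='softmax'),
])

# Spread training across multiple GPUs (4 here is an illustrative count).
model = multi_gpu_model(model, gpus=4)
model.compile(loss='categorical_crossentropy', optimizer='adam')
# model.fit(x_train, y_train, ...)

# Export in native MXNet format for inference with the MXNet Module API.
# save_mxnet_model is provided by keras-mxnet; 'my_cnn' is a made-up prefix.
data_names, data_shapes = keras.models.save_mxnet_model(model=model,
                                                        prefix='my_cnn')
```

The exported symbol and parameter files can then be loaded directly by
MXNet for serving, without any Keras dependency at inference time.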

From our initial benchmarks, CNNs trained with Keras-MXNet are up to 3x
faster on GPUs compared to the default backend. See the benchmark module
for more details.

RNN support in this release is experimental, with a few known issues and
unsupported functionalities. See the "Using RNN with Keras-MXNet"
limitations and workarounds doc
(https://github.com/awslabs/keras-apache-mxnet/blob/master/docs/mxnet_backend/using_rnn_with_mxnet_backend.md)
for more details.

See the Release Notes for more details on unsupported operators and known
issues. We will continue working to close these gaps in future releases.

Thank you to all the contributors: Lai Wei, Karan Jariwala, Jiajie Chen,
Kalyanee Chendke (https://github.com/kalyc), and Junyuan Xie.

We welcome your contributions:
https://github.com/awslabs/keras-apache-mxnet. There is an open issue
tracking the list of operators still to be implemented; do check it out
and send a PR: https://github.com/awslabs/keras-apache-mxnet/issues/18

Best,
Sandeep