Thanks, that's a nice summary. Great job, and it's good to hear about the
progress. I think we can do some exciting stuff in terms of parsing
the Python AST and converting it to a computational graph. Maybe we could
brainstorm on that further on the linked ticket.
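To make the AST idea a bit more concrete, here is a minimal, hypothetical sketch using only the standard-library `ast` module. None of the names below are MXNet APIs; this just illustrates one way an expression could be lowered into a flat computational graph for later processing:

```python
import ast

def to_graph(expr):
    """Parse a Python expression and return a flat computational graph.

    The graph is a list of nodes in topological order; each node is a
    tuple (op_name, [indices of input nodes]). Illustrative only.
    """
    tree = ast.parse(expr, mode="eval")
    nodes = []

    def visit(node):
        if isinstance(node, ast.BinOp):
            left = visit(node.left)
            right = visit(node.right)
            op = type(node.op).__name__.lower()  # e.g. "add", "mult"
            nodes.append((op, [left, right]))
        elif isinstance(node, ast.Name):
            nodes.append(("var:" + node.id, []))
        elif isinstance(node, ast.Constant):
            nodes.append(("const:" + repr(node.value), []))
        else:
            raise ValueError("unsupported node: %r" % node)
        return len(nodes) - 1  # index of the node just appended

    visit(tree.body)
    return nodes

graph = to_graph("x * w + b")
# graph is [('var:x', []), ('var:w', []), ('mult', [0, 1]),
#           ('var:b', []), ('add', [2, 3])]
```

A real converter would of course need to handle calls, attribute access, control flow, and so on, but the basic shape (recursive AST visitor emitting graph nodes) stays the same.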

On Wed, May 22, 2019 at 12:12 AM Jun Wu <wujun....@gmail.com> wrote:
>
> Dear Community,
>
> A few months ago, we submitted this RFC
> <https://github.com/apache/incubator-mxnet/issues/14253> proposing to
> introduce a NumPy-compatible coding experience in MXNet. As it has been
> some time since the proposal, we would like to share our progress with the
> community and listen to feedback and suggestions on both the technical
> implementation and the way the project is operated.
>
> We set our first milestone as tackling the problem that MXNet does not
> support scalar and zero-size tensors. Last month, we submitted the PR
> <https://github.com/apache/incubator-mxnet/pull/14661> providing the
> infrastructure to support those two types of tensors in MXNet. This work
> has touched almost every file and all language bindings in the MXNet
> codebase. It would have been impossible to deliver a complete solution
> without contributions from many MXNet developers across different
> organizations.
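For readers unfamiliar with the two tensor types mentioned above, this small example shows their semantics in NumPy itself, which is the behavior the PR brings to MXNet (requires `numpy` installed; shown with NumPy rather than MXNet for brevity):

```python
import numpy as np

# A scalar (0-d) tensor: shape is the empty tuple, ndim is 0.
s = np.array(3.5)
assert s.shape == () and s.ndim == 0

# A zero-size tensor: at least one dimension is 0, so size is 0.
z = np.ones((2, 0, 3))
assert z.size == 0

# Both still participate in ordinary operations under the usual shape rules.
assert (s + 1).item() == 4.5
assert (z * 2).shape == (2, 0, 3)
```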
>
> With the infrastructure for scalar and zero-size tensors in place, we are
> currently working on implementing NumPy operators in MXNet. We created a
> list of operators <https://github.com/apache/incubator-mxnet/issues/14327>
> to implement, drawn from the D2L book <http://www.d2l.ai/>, and hope to
> provide full NumPy operator coverage for the book by the end of next
> month.
>
> In the future, we plan to provide NumPy operator support for GluonCV
> <https://github.com/dmlc/gluon-cv> and GluonNLP
> <https://github.com/dmlc/gluon-nlp>. We also intend to explore extending
> our work to support libraries that depend heavily on NumPy, not only in
> the deep learning world but also in the broader data science community,
> where techniques employed by deep learning, such as automatic
> differentiation, symbolic programming, and GPU computing, can be
> beneficial.
>
> Thank you very much for taking the time to read this email and for caring
> about our efforts to make MXNet a super user-friendly deep learning
> framework. We look forward to your comments, suggestions, and
> contributions to this project.
>
> Best,
> Developers of MXNet NumPy Project
>
> References
> [1] Development branch: https://github.com/apache/incubator-mxnet/tree/numpy
> [2] PR for supporting scalar and zero-size tensors:
> https://github.com/apache/incubator-mxnet/pull/14661
> [3] First batch of NumPy operators to be implemented:
> https://github.com/apache/incubator-mxnet/issues/14327
> [4] The D2L book: https://github.com/d2l-ai/d2l-en
> [5] GluonCV: https://github.com/dmlc/gluon-cv
> [6] GluonNLP: https://github.com/dmlc/gluon-nlp
