Well, I'm not going to talk about technical stuff.
You can find some design concepts in the docs or wiki.
(https://mxnet.incubator.apache.org/versions/master/architecture/index.html)

For me, working on MXNet is a rare chance to verify my ideas about
what a machine learning framework should be.
While implementing the MXNet Julia package, I could explicitly compare
the experience of MXNet with Flux's
...and then start complaining about both. :p
I think comparison is one way to move forward.
So that's why I said I want to increase the diversity of DL tools in Julia.

I like the spirit of portability in the MXNet community.
We welcome all language packages and stay open-minded.
Although some of these languages might be considered unpopular in ML/DL,
this community still keeps polishing them day in and day out.
Yeah, someone has to try them, compare, and gain experience from the
process, regardless of how the language has been evaluated in ML.
That experience is valuable.
(e.g. I think the lack of function overloading is a disadvantage
 of Python, while the file-based namespace does help maintainability
 in Python.
 After doing some work in Julia, I can clearly point out the pros
 and cons; a toy sketch of the overloading point is below.)
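
Just to illustrate what I mean by overloading (made-up names, nothing
MXNet-specific): in Julia the same function name can carry several
methods selected by argument types, where Python would need isinstance
checks or functools.singledispatch.

  # Toy example only -- made-up names, not MXNet code.
  # One function name, multiple methods chosen by argument types.
  area(r::Real) = π * r^2            # circle, from a radius
  area(w::Real, h::Real) = w * h     # rectangle, from width and height

  area(1.0)       # ≈ 3.14159
  area(2.0, 3.0)  # 6.0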

From a long-term view... maybe twenty years from now,
none of the languages we are using today will still be popular.
But I believe the meta-rules extracted from these experiences will still apply.

So... why not have a Rust lib? Maybe Rust's macros can do something crazy.
E.g. the Julia package shows a more elegant way to stack a network than
Python, thanks to metaprogramming:

  mlp = @mx.chain mx.Variable(:data)             =>
    mx.FullyConnected(name=:fc1, num_hidden=128) =>
    mx.Activation(name=:relu1, act_type=:relu)   =>
    mx.FullyConnected(name=:fc2, num_hidden=64)  =>
    mx.Activation(name=:relu2, act_type=:relu)   =>
    mx.FullyConnected(name=:fc3, num_hidden=10)  =>
    mx.SoftmaxOutput(name=:softmax)
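
For the curious, here is a rough sketch of how a chain macro like that
can be built. This is just an illustration of the metaprogramming idea,
not the actual MXNet.jl implementation; it assumes each stage is a call
that takes the previous stage's result as its first positional argument.

  # Rough sketch only -- not the real mx.@chain.
  macro chain(expr)
      # `=>` is right-associative, so flatten `a => b => c` into [a, b, c].
      stages = Any[]
      node = expr
      while node isa Expr && node.head == :call && node.args[1] == :(=>)
          push!(stages, node.args[2])
          node = node.args[3]
      end
      push!(stages, node)

      # Thread each result into the next call, e.g.
      # FullyConnected(Variable(:data), name=:fc1, ...) and so on.
      acc = stages[1]
      for stage in stages[2:end]
          acc = Expr(:call, stage.args[1], acc, stage.args[2:end]...)
      end
      esc(acc)
  end

Expanding the MLP example above this way just produces the nested calls
you would otherwise write out by hand, as in Python.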


Wondering where that leaves MxNet...

Actually, I don't care about this issue.
We know tools and frameworks keep changing.
People learn lessons from building and experimenting.
That's just the path of technology evolution.
The point is the ideas and experiences with which this community
is going to surprise you.


Iblis Lin
林峻頤

On 2/11/19 12:04 PM, Zach Boldyga wrote:
Those are compelling points! There's also another more recent follow-up
from the Julia team: https://julialang.org/blog/2018/12/ml-language-compiler

It seems that Julia will likely have its place in ML regardless of how
other tools progress; the latest offerings from Julia/Flux are really
compelling.

Wondering where that leaves MxNet...

Zach Boldyga
Scalabull  |  Founder
1 (866) 846-8771 x 101
