Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-04-28 Thread Sheng Zha
My understanding is that DJL depends on MXNet, so if you want to bring the JNA 
layer from DJL into MXNet, it will create a circular dependency as a 3rdparty 
module. In terms of stability, I was referring to the development of the code 
base rather than the performance.

-- 
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-620815186

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-04-28 Thread Lanking
There is no hand-written code for the JNA layer; everything is generated. This 
enforces a consistent standard and a minimal layer over the C API, which helps 
avoid errors and mistakes.

About JNA, you can find more information here: 
[jnarator](https://github.com/awslabs/djl/tree/master/mxnet/jnarator). We built 
an entire project for the JNA generation pipeline. All we need is a header file 
from MXNet to build everything. The dependencies required by the Gradle build 
are minimal, as you can see in 
[here](https://github.com/awslabs/djl/blob/master/mxnet/jnarator/build.gradle#L5-L15).
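
For illustration only, here is a small hand-written sketch of the kind of JNA 
mapping that a generator produces from MXNet's `c_api.h`. The interface name and 
the choice of functions are my own examples, not the actual classes generated in 
DJL:

```java
import com.sun.jna.Library;
import com.sun.jna.Native;
import com.sun.jna.Pointer;
import com.sun.jna.ptr.IntByReference;
import com.sun.jna.ptr.PointerByReference;

// Illustrative interface name; the real binding in DJL is generated by
// jnarator from MXNet's c_api.h and covers the whole C API.
public interface MxnetLibrary extends Library {
    // Loads libmxnet from the system library path.
    MxnetLibrary INSTANCE = Native.load("mxnet", MxnetLibrary.class);

    // int MXGetVersion(int *out);
    int MXGetVersion(IntByReference out);

    // const char *MXGetLastError();
    String MXGetLastError();

    // int MXNDArrayCreateNone(NDArrayHandle *out);
    int MXNDArrayCreateNone(PointerByReference out);

    // int MXNDArrayFree(NDArrayHandle handle);
    int MXNDArrayFree(Pointer handle);
}
```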
  

To address the concern about stability, we tested DJL MXNet with a 100-hour 
inference run on a server and it remained stable. The training experience is 
also smooth; a 48-hour multi-GPU run was stable as well. Performance is very 
close to Python for large models, and it may bring a significant boost when the 
model is smaller than or comparable to SqueezeNet.

@frankfliu can provide more information about the JNA layer.

-- 
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-620811482

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-04-28 Thread Sheng Zha
@lanking520 would it create a circular dependency? How stable is the JNA layer 
and what changes are expected? It would be great if you could share a pointer to 
the JNA code to help clarify these concerns.

-- 
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-620803313

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 JVM Language development (#17783)

2020-04-28 Thread Lanking
@szha For option 4, I would recommend consuming the JNA layer as a submodule 
from DJL. I am not sure whether this recommendation counts as "add a dependency 
in mxnet".

There are two key reasons supporting that:

1. DJL moves really fast and we can quickly change the JNA layer whenever 
needed, compared to the merging speed in MXNet.

2. Consuming it as a submodule means the MXNet community doesn't have to take 
on much of the maintenance. The DJL team will regularly provide a JAR for MXNet 
users to consume (a rough consumer-side sketch is included at the end of this 
comment).

We can also contribute code back to the MXNet repo, since it is open source, 
but we may still keep a copy in our repo for fast iteration. That could cause 
the JNA layer versions to diverge.

Overall, my recommendation on option 4 leans towards consuming the DJL JNA 
layer as a submodule.
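
To make the consumer side concrete, here is a hedged sketch of how a downstream 
JVM project could call the binding shipped in such a JAR. The class and the tiny 
inline mapping below are hand-written stand-ins for the generated code, not 
DJL's actual API:

```java
import com.sun.jna.Library;
import com.sun.jna.Native;
import com.sun.jna.ptr.IntByReference;

// Stand-in for the generated binding that a published JNA JAR would provide.
public final class MxnetVersionCheck {
    interface MxnetLibrary extends Library {
        int MXGetVersion(IntByReference out); // int MXGetVersion(int *out);
        String MXGetLastError();              // const char *MXGetLastError();
    }

    public static void main(String[] args) {
        // Requires libmxnet on the library path and JNA on the classpath.
        MxnetLibrary mxnet = Native.load("mxnet", MxnetLibrary.class);
        IntByReference version = new IntByReference();
        if (mxnet.MXGetVersion(version) != 0) {
            throw new IllegalStateException(mxnet.MXGetLastError());
        }
        System.out.println("libmxnet version: " + version.getValue());
    }
}
```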

-- 
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-620750376