There is no hand-written code for JNA; everything is generated. This enforces a consistent standard and keeps the C layer minimal, which avoids manual errors and mistakes.
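
For illustration, here is a hand-written sketch of what one generated mapping might look like. The real interface is produced by jnarator from the MXNet header; `MXGetVersion` and `MXGetLastError` are real MXNet C API functions, but this interface itself is a hypothetical stand-in for the generated one:

```java
import com.sun.jna.Library;
import com.sun.jna.Native;
import com.sun.jna.ptr.IntByReference;

// Sketch only: the actual interface is generated by jnarator from the MXNet C header.
public interface MxnetLibrary extends Library {

    MxnetLibrary INSTANCE = Native.load("mxnet", MxnetLibrary.class);

    // Maps the C declaration: int MXGetVersion(int *out);
    int MXGetVersion(IntByReference out);

    // Maps the C declaration: const char *MXGetLastError();
    String MXGetLastError();
}
```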

You can find more information about the JNA generation here:
[jnarator](https://github.com/awslabs/djl/tree/master/mxnet/jnarator). We built
an entire project for the JNA generation pipeline; all it needs is a header file
from MXNet to build everything. The dependencies required by the Gradle build are
minimal, as you can see
[here](https://github.com/awslabs/djl/blob/master/mxnet/jnarator/build.gradle#L5-L15).
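
Because every call goes through this thin generated layer, error handling stays uniform: each native function returns a status code, and the error message is fetched from the C side. A minimal usage sketch, reusing the hypothetical interface above:

```java
import com.sun.jna.ptr.IntByReference;

public final class VersionCheck {
    public static void main(String[] args) {
        IntByReference version = new IntByReference();
        // MXNet C API functions return 0 on success, non-zero on failure.
        if (MxnetLibrary.INSTANCE.MXGetVersion(version) != 0) {
            throw new IllegalStateException(MxnetLibrary.INSTANCE.MXGetLastError());
        }
        System.out.println("MXNet version: " + version.getValue());
    }
}
```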

To address the concern about stability: we tested DJL MXNet with a 100-hour
inference run on a server and it remained stable. The training experience is
also smooth; a 48-hour multi-GPU run was likewise stable. Performance is very
close to Python for large models, and it may bring a huge boost when the model
is at or below "squeezenet level".

@frankfliu can provide more information about the JNA layer.
