Hi, instead of JNA, I would be happy to provide bindings for the C API and 
maintain packages based on the JavaCPP Presets here:
https://github.com/bytedeco/javacpp-presets/tree/master/mxnet
Unlike JNA, JavaCPP adds no overhead and is often faster than manually written 
JNI. JavaCPP also provides more tooling than JNA to automate parsing header 
files and packaging native libraries in JAR files (see the sketch at the end 
of this message). I have 
been maintaining modules for TensorFlow based on JavaCPP, and we actually got a 
boost in performance when compared to the original JNI code:
https://github.com/tensorflow/java/pull/18#issuecomment-579600568
I would be able to do the same for MXNet and maintain the result in a 
repository of your choice. Let me know if this sounds interesting! BTW, the 
developers of DJL also seem open to switching from JNA to JavaCPP, even though 
it is not a high priority for them. Still, standardizing how native bindings 
are created and loaded, alongside other libraries for which JavaCPP is pretty 
much already the standard (such as OpenCV, TensorFlow, CUDA, FFmpeg, LLVM, 
Tesseract), could go a long way toward alleviating concerns about stability.
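
For reference, wiring a library into the JavaCPP Presets usually comes down to 
a small presets class that points the parser at the C API headers. Below is a 
minimal sketch of what that could look like for MXNet; the header path, the 
target package names, and the MXNET_DLL macro handling are assumptions for 
illustration, not the actual preset:

    import org.bytedeco.javacpp.annotation.Platform;
    import org.bytedeco.javacpp.annotation.Properties;
    import org.bytedeco.javacpp.tools.Info;
    import org.bytedeco.javacpp.tools.InfoMap;
    import org.bytedeco.javacpp.tools.InfoMapper;

    // Hypothetical presets class: JavaCPP's parser reads the listed headers at
    // build time and generates Java classes plus the JNI glue code, which then
    // get bundled with the native library into platform-specific JARs.
    @Properties(
        value = @Platform(include = "mxnet/c_api.h", link = "mxnet"),
        target = "org.bytedeco.mxnet",
        global = "org.bytedeco.mxnet.global.mxnet")
    public class mxnet implements InfoMapper {
        public void map(InfoMap infoMap) {
            // Strip the export macro (assumed name) so the parser sees plain C
            // declarations when it walks the header.
            infoMap.put(new Info("MXNET_DLL").cppTypes().annotations());
        }
    }

The parser is normally invoked through the javacpp Maven plugin during the 
build, so downstream users would only depend on the generated JARs and never 
touch JNI themselves.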
