Hi MXNet community,

Since MKLDNN was first integrated as a backend in release 1.2, the community has 
been continuously improving the quality and performance of the MKLDNN CPU backend.
Today, the MKLDNN backend is widely used for inference, especially INT8 
inference, and we have received a lot of very positive feedback from MXNet users. 

The milestones achieved so far:

- MKLDNN integrated into Apache MXNet in release 1.2, Feb 2018 [1]
- MKLDNN made the default CPU backend when building from source, Jan 2019 [2]
- MKLDNN subgraph optimization enabled by default for inference, Jul 2019 [3]
- MKLDNN major version upgrade in release 1.6, Oct 2019 [4]
 
To make Apache MXNet more successful and to strengthen its technical leadership 
in the industry, I propose making MKLDNN the default CPU backend in all binary 
distributions starting from the next release.
This new milestone includes:

- Statically link the MKLDNN library into the binary to avoid version mismatches 
at runtime [5]
- Make MKLDNN the default in nightly builds from master before the 1.7 release
- Ship binary distributions with MKLDNN as the default starting from the 1.7 release

What will change:

- The mxnet and mxnet-cuXX binaries will be built with MKLDNN=1
- The mxnet-mkl and mxnet-cuXXmkl binaries will remain unchanged in the 1.x minor 
releases and are planned for removal in the next major release (2.0)
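
As a quick sketch of how users could verify the switch (assuming the 
mxnet.runtime feature-flag API available since 1.5), an installed binary can be 
checked at runtime for whether it was built with MKLDNN enabled:

    import mxnet as mx
    from mxnet.runtime import Features

    # Each pip package exposes its compile-time feature flags at runtime,
    # so the MKLDNN=1 change can be confirmed after installation.
    features = Features()
    print(features.is_enabled('MKLDNN'))  # expected to print True once MKLDNN is the default
    print(mx.__version__)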

Suggestions and comments are highly appreciated.

Thanks,

--Patric


[1] https://github.com/apache/incubator-mxnet/pull/9677
[2] https://lists.apache.org/thread.html/bfeae6ee46374112eb4dff1470c262959101e4bffb19930926963535@%3Cdev.mxnet.apache.org%3E
[3] https://github.com/apache/incubator-mxnet/pull/15518
[4] https://lists.apache.org/thread.html/f46ab920f18795496eafe713e6e9e561c684e06189085cec17b401dc@%3Cdev.mxnet.apache.org%3E
[5] https://github.com/apache/incubator-mxnet/pull/16731