Re: What's everyone working on?

2017-09-27 Thread Jun Wu
I had been working on the sparse tensor project with Haibin. After its first stage was wrapped up, I started my work on the quantization project (INT-8 inference). The benefits of using quantized models for inference include much higher inference throughput than FP32 models, with acceptable accuracy loss.
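For context, the INT8 work described here is post-training quantization. Below is a minimal sketch of that flow through MXNet's mxnet.contrib.quantization module; the checkpoint name and calibration settings are illustrative assumptions, not details from this mail:

    import mxnet as mx
    from mxnet.contrib.quantization import quantize_model

    # Load a trained FP32 symbolic model (prefix and epoch are placeholders).
    sym, arg_params, aux_params = mx.model.load_checkpoint('resnet-50', 0)

    # Convert to an INT8 model. In practice a calibration dataset is passed
    # via calib_data/calib_mode to bound the accuracy loss; 'none' skips it.
    qsym, qarg_params, qaux_params = quantize_model(
        sym=sym, arg_params=arg_params, aux_params=aux_params,
        ctx=mx.cpu(), quantized_dtype='int8', calib_mode='none')

    mx.model.save_checkpoint('resnet-50-quantized', 0, qsym, qarg_params, qaux_params)

The throughput gain comes from executing 8-bit integer kernels in place of 32-bit floating-point ones; calibration is what keeps the accuracy loss acceptable.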

Re: What's everyone working on?

2017-10-02 Thread Jun Wu
> 👏 > On Mon, Oct 2, 2017 at 8:02 PM Seb Kiureghian wrote: > > It would be awesome if MXNet were the first DL framework to support Nvidia Volta. What do you all think about cutting a v0.12 release once that integration is ready?

Re: update build instructions

2017-11-02 Thread Jun Wu
Many developers are using the Makefile. Getting rid of it doesn't sound user-friendly. On Thu, Nov 2, 2017 at 8:53 AM, Chris Olivier wrote: > I don't know why we don't get rid of the Makefile altogether and use cmake. > It's a pain to manage both of them independently. Does anyone know why we

Re: Intel Plan for the contribution to MXNET

2018-01-31 Thread Jun Wu
Hi Patric, Thanks for the contribution. It’s great to see action on developing INT8 inference for CPU! I have a few questions and hope you can answer them. 1. When you said your work is aligned with PR 9552, did you mean you used quantization

Re: Intel Plan for the contribution to MXNET

2018-01-31 Thread Jun Wu
Great. Let's coordinate to keep our efforts aligned. On Wed, Jan 31, 2018 at 9:51 PM, Zhao, Patric wrote: > Thanks, Jun, please see my comments inline. > > Wenting and Jin will follow up on the tasks in the PR.

Design Proposal: Logging MXNet Data for Visualization in TensorBoard

2018-02-09 Thread Jun Wu
Hi All, We have drafted a design proposal for a data logger in the MXNet Python package for visualizing MXNet data in TensorFlow's TensorBoard. Please feel free to let us know your comments, suggestions, and corrections on the proposal. Thank you very much for your time and consideration. https
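A minimal sketch of the logging workflow the proposal describes, modeled on the mxboard package that grew out of this work; the tags and log directory below are illustrative:

    import mxnet as mx
    from mxboard import SummaryWriter

    # Writes TensorBoard-readable event files under ./logs.
    sw = SummaryWriter(logdir='./logs')
    for step in range(100):
        data = mx.nd.random.normal(shape=(10,))
        sw.add_scalar(tag='mean', value=data.mean().asscalar(), global_step=step)
        sw.add_histogram(tag='values', values=data, global_step=step, bins=20)
    sw.close()

Pointing TensorBoard at the log directory (tensorboard --logdir=./logs) then renders the scalars and histograms without any TensorFlow code on the MXNet side.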

Re: Design Proposal: Logging MXNet Data for Visualization in TensorBoard

2018-02-09 Thread Jun Wu
Thank you for your kind words, Chris! On Fri, Feb 9, 2018 at 8:27 PM, Chris Olivier wrote: > Fantastic document, Jun! > > On Fri, Feb 9, 2018 at 2:58 PM Jun Wu wrote: > > Hi All, > > We have drafted a design proposal for a data logger in the MXNet Python

Re: Design Proposal: Logging MXNet Data for Visualization in TensorBoard

2018-02-09 Thread Jun Wu
Feb 9, 2018 at 5:57 PM, Jun Wu wrote: > > Hi All, > > We have drafted a design proposal for a data logger in the MXNet Python > > package for visualizing MXNet data in TensorFlow's TensorBoard. Please feel > > free to let us know your comments, suggestions

Re: Help to subscribe MXNet Slack channel

2018-03-05 Thread Jun Wu
Hi Yuji, Welcome to the MXNet community! We have been working on delivering a complete logging solution in MXNet for users to employ TensorFlow's TensorBoard for visualization. You can find our work in the following links: - Design Doc: https://cwiki.apache.org/confluence/display/MXNET/Logg

Re: Help to subscribe MXNet Slack channel

2018-03-06 Thread Jun Wu
on ONNX is that you would like to use the "torch.onnx._optimize_trace" method? > Thanks, > Yuji > 2018-03-06 15:36 GMT+09:00 Jun Wu: >> Hi Yuji, >> >> Welcome to the MXNet community! >> >> We have been working on delivering a complete

Re: [VOTE] Release Apache MXNet (incubating) version 1.2.0.RC0

2018-04-21 Thread Jun Wu
+1 Compiled from source. Ran the model quantization example. Both quantized model generation and inference ran successfully. On Fri, Apr 20, 2018 at 5:14 PM, Indhu wrote: > +1 > > Compiled from source on a P3 instance. Tested the SSD example and some Gluon > examples. > > On Wed, Apr 18, 2018

Re: [VOTE] Release Apache MXNet(incubating) version 1.2.0.RC2

2018-05-04 Thread Jun Wu
+1 I built from source and ran all the model quantization examples successfully. On Fri, May 4, 2018 at 3:05 PM, Anirudh wrote: > Hi Pedro, Haibin, Indhu, > > Thank you for your inputs on the release. I ran the test: > `test_module.py:test_forward_reshape` for 250k times with different seeds. >

Re: [QUICK VOTE] Release Apache MXNet(incubating) version 1.2.0.RC3

2018-05-15 Thread Jun Wu
+1 On Tue, May 15, 2018 at 9:02 AM, Haibin Lin wrote: > +1, same reason as Indhu's. > > On Tue, May 15, 2018 at 12:32 AM, Indhu wrote: > > +1 given this is the same as the previous RC with a couple of markdown files > removed. > > On Mon, May 14, 2018, 11:51 PM Anirudh wrote: > > Hi all,

Re: Vote to stop using JIRA

2018-06-08 Thread Jun Wu
+1 I have used Jira for a few years. The system I used back then had much richer customized features than the Jira setup MXNet has, at the cost of a specialized team developing Jira plugins. Even so, I'm still not a big fan of it. The learning curve is quite steep to make the personal dashboard

Re: The operator check for Scala Package

2018-06-20 Thread Jun Wu
I don't think it's reasonable to fail on purpose in Scala when changing documentation or adding operators in C++. At the least, it should follow the same behavior as we have for the Python binding. On Wed, Jun 20, 2018 at 10:13 AM Qing Lan wrote: > Hi Haibin, > > The operator change is any change on the

Re: The operator check for Scala Package

2018-06-20 Thread Jun Wu
We should reach an agreement on the responsibility/ownership before moving forward. 1. Who will take ownership of fixing the Scala build failure if an operator is changed/added in C++? 2. What is the expected turnaround time for fixing the Scala build failure? 3. What if we are short of resources

Re: [VOTE] Release MXNet version 1.2.1.RC1

2018-07-12 Thread Jun Wu
+1 Built from source and ran the GPU unit tests. On Thu, Jul 12, 2018 at 8:01 AM sandeep krishnamurthy < sandeep.krishn...@gmail.com> wrote: > +1 > > Built from source. Tested on CPU and GPU with Keras-MXNet (ResNet and LSTM > examples), works as expected. > > Best, > Sandeep > > On Thu, Jul 12,

Re: Include MKLDNN into default mxnet pip package

2018-10-17 Thread Jun Wu
If my understanding of the context is correct, it should be acknowledged that the significant performance improvement comes from the Intel MKLDNN team's contribution in this PR: https://github.com/apache/incubator-mxnet/pull/12530. On Wed, Oct 17, 2018 at 3:12 PM kellen sunderland < kellen.sund

Re: [ANNOUNCE] MKLDNN becomes the default CPU backend in Apache/MXNet master branch

2019-01-11 Thread Jun Wu
Great work on boosting MXNet/MKLDNN performance significantly! On Fri, Jan 11, 2019 at 7:08 PM Li, Mu wrote: > Awesome job! That’s a great benefit to CPU users > > Best > Mu > > On Jan 11, 2019, at 6:59 PM, Zhao, Patric wrote: > > Dear all, > > I am pleased to announce that the MKLDNN

[RFC] Introducing NumPy-compatible coding experience in MXNet

2019-02-25 Thread Jun Wu
Dear Community, We have published a post here requesting comments on our proposal for improving MXNet usability. We'd like to hear the thoughts and suggestions of the community, and we welcome any form of contribution to make MXNet's usability better.
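As a rough sketch of the experience the RFC targets, here is what NumPy-compatible coding later looked like through MXNet's mxnet.numpy namespace (the namespace itself is an assumption about the final API, not part of the original post):

    from mxnet import np, npx
    npx.set_np()  # switch MXNet to NumPy-compatible semantics

    a = np.arange(6).reshape(2, 3)
    b = np.ones((2, 3))
    print((a * b).sum(axis=0))          # NumPy-style broadcasting and reductions
    print(np.dot(a.T, np.ones((2,))))   # familiar NumPy function names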

Implementing zero-dim and zero-size tensors in MXNet and its impact on your codebases

2019-04-10 Thread Jun Wu
Dear Community, A while ago, we sent out an RFC discussing the initiative of introducing NumPy compatibility into MXNet. As the first outcome of this initiative, we submitted the PR providing support for zero-dim and zero-size tensors.
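A small sketch of the new behavior, using the mxnet.numpy namespace as an assumed frontend (this mail predates the final API):

    from mxnet import np, npx
    npx.set_np()

    s = np.array(3.14)       # zero-dim (scalar) tensor
    print(s.shape, s.ndim)   # () 0

    z = np.ones((0, 4))      # zero-size tensor: a valid shape with 0 elements
    print(z.shape, z.size)   # (0, 4) 0

Under the legacy semantics, 0 in a shape meant "unknown dimension", so neither a shape of () nor a shape of (0, 4) was representable; that is why the change ripples through codebases that inspect shapes.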

Re: Implementing zero-dim and zero-size tensors in MXNet and its impact on your codebases

2019-04-11 Thread Jun Wu
> Really nice improvement over MXNet's usability! I suggest that we could make numpy-compatible behavior the default in 2.0. > > On Wed, Apr 10, 2019 at 11:34 PM Jun Wu wrote:

Re: Implementing zero-dim and zero-size tensors in MXNet and its impact on your codebases

2019-04-11 Thread Jun Wu
I have observed that this has been an undocumented guideline to not break C APIs (example: https://github.com/apache/incubator-mxnet/pull/11429#discussion_r199564999). Although the C APIs are supposed to

Re: [QUESTION] mxnet/Tuple vs nnvm/Tuple

2019-04-16 Thread Jun Wu
include/mxnet/tuple.h was first copied from nnvm in this PR so that we could make changes to it to support zero-dim and zero-size tensors without affecting the TVM project. That PR changed most of the places where nnvm::Tuple and nnvm::TShape were used

[Announcement] New Committer - Hao Jin

2019-04-29 Thread Jun Wu
Please join me in welcoming Hao Jin (https://github.com/haojin2) from AWS as a new committer. Hao has designed and implemented many sophisticated algorithms for tensor operations. His work has greatly expanded the coverage of MXNet's operator inventory and enhanced the performance of many operators

[Announcement] New Committer - Zhennan Qin

2019-04-29 Thread Jun Wu
Please join me in welcoming Zhennan Qin (https://github.com/ZhennanQin) from Intel as a new committer. Zhennan is the main author of accelerating MXNet/MKLDNN inference through operator fusion and model quantization. His work has placed MXNet in an advantageous position for inference workloads on Intel

Report of MXNet NumPy Project Status

2019-05-22 Thread Jun Wu
Dear Community, A few months ago, we submitted this RFC proposing the introduction of a NumPy-compatible coding experience into MXNet. As it has been some time since the proposal, we would like to share our progress with the community and listen to feedback

Re: Making new operators and AMP lists

2019-05-28 Thread Jun Wu
Thanks for initiating the discussion on dev. I understand the dilemma in designing AMP: making the feature usable and functional while not breaking other developers' experience. However, IMO, this is not about WHEN we should let other developers know they have made a mistake by not adding

Re: Making new operators and AMP lists

2019-05-28 Thread Jun Wu
Hi Marco, As stated by others, the concern is not about how much cost it incurs, but about whether it should be there from the beginning. Just imagine how you would explain to an MKLDNN developer who just added an operator for CPU computation that he/she must put the operator name in one of the AMP lists
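For readers outside the thread, "the AMP lists" are the per-operator classification tables AMP keeps in the Python package (under mxnet/contrib/amp/lists at the time). A hedged, abbreviated sketch of their shape, with entries chosen as illustrative assumptions rather than a verbatim excerpt:

    # Ops considered numerically safe to run in FP16.
    FP16_FUNCS = [
        'Convolution',
        'FullyConnected',
    ]

    # Ops kept in FP32 for numerical stability.
    FP32_FUNCS = [
        'softmax',
        'log_softmax',
    ]

Every newly registered operator has to be filed into one of these lists for AMP to know how to cast around it, which is exactly the maintenance burden being debated in this thread.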

[Call for Contributions] NumPy operators in MXNet

2019-09-16 Thread Jun Wu
Dear Community, As part of the scope of MXNet 2.0 (see the RFC), we have been putting a lot of effort into implementing NumPy operators in MXNet. Over the past three months, we have implemented about 140 NumPy operators, of which 80 have
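A hedged sketch of how such a contributed operator is typically verified, comparing the mxnet.numpy result against official NumPy; the operator choice and tolerances here are illustrative:

    import numpy as onp
    from mxnet import np, npx
    from mxnet.test_utils import assert_almost_equal
    npx.set_np()

    x = np.random.uniform(size=(3, 4))
    # Ground truth from official NumPy, computed on a host copy of the data.
    expected = onp.tensordot(x.asnumpy(), x.asnumpy().T, axes=1)
    actual = np.tensordot(x, x.T, axes=1)
    assert_almost_equal(actual.asnumpy(), expected, rtol=1e-5, atol=1e-6)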

Re: [VOTE] Release Apache MXNet (incubating) version 1.6.0.rc1

2020-01-10 Thread Jun Wu
+1 (binding) Built from source. Ran all the GPU tests and the test_numpy*.py CPU tests without problems. On Fri, Jan 10, 2020 at 9:43 PM Skalicky, Sam wrote: > We can enable building nightlies for feature branches too. > > Sam > > On Jan 10, 2020, at 7:48 PM, Lin Yuan wrote: > > We can relea