Re: improving API docs and tutorials user experience

2018-09-20 Thread Afrooze, Sina
This is not a fully baked idea, so bear with me. Instead of using regex, what 
if we use search techniques to locate the page the user wants to look at? If I'm 
currently looking at mxnet.ndarray.convolution operator in master, take me to 
the same operator API in 1.2.1, regardless of whether it is in a similar page 
or whether we decided to re-organize API pages and create a page just for 
convolution. Likewise for tutorials (i.e. find the tutorial that best matches 
the content I'm looking at, even if it's been renamed or moved around). I feel 
like it can be a great internship project. - Sina
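
For what it's worth, Sina's idea could be prototyped with plain stdlib fuzzy
matching before reaching for a real search index. The page identifiers below
are made up for illustration; the real page set would come from a crawl or
the site's search index:

```python
import difflib

def find_equivalent(current_page, target_version_pages):
    """Return the closest-matching page in another doc version, or None."""
    matches = difflib.get_close_matches(
        current_page, target_version_pages, n=1, cutoff=0.4)
    return matches[0] if matches else None

# A made-up 1.2.1 page set in which the convolution doc was re-organized,
# so the path no longer matches the master path exactly.
v121_pages = [
    "api/python/ndarray.convolution",
    "api/python/ndarray.pooling",
    "tutorials/gluon/mnist",
]

best = find_equivalent("api/python/ndarray/ndarray.convolution", v121_pages)
```

Even with the page renamed, the closest match is still the convolution doc;
below the cutoff, the function returns None instead of escorting the user to
an unrelated page.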

On 9/18/18, 5:08 PM, "Aaron Markham"  wrote:

Hello dev list!

I've been doing a bit with .htaccess redirects on the site to get us things
like custom 404s and redirects for Google searches that come in on outdated
material. That's working pretty well because the redirects are simple.

I need help, though. I've written a short proposal on how I think the UX for
API docs and tutorials should work. I'm not attempting a major jump here; this
is a small amount of change for a better experience.
For example, right now, if you're browsing API docs for 1.3.0 and want to
see the master version, you would hit the dropdown and select master. Then
you get taken to the home page for master and have to go find the document
again. (It's always done this.) It should stay on that document, but give
the latest version. Or for tutorials, if you request a really old, likely
broken tutorial because it happens to show up in Google search results, the
site should kindly escort you to the latest, tested tutorial.

mod_rewrite and .htaccess can do this, but it requires regex skills that I
lack. I've tried hundreds of variations now, and would like some guidance.
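
The path-preserving rewrite described above can be prototyped as an ordinary
regex before porting it to mod_rewrite. The URL layout here is assumed for
illustration (versioned docs under /versions/&lt;ver&gt;/..., master docs at the
root), not taken from the live site:

```python
import re

# Assumed layout: /versions/<ver>/<path> for versioned docs,
# /<path> for the master docs.
VERSIONED = re.compile(r"^/versions/[^/]+/(.*)$")

def to_master(path):
    """Rewrite a versioned doc path to its master equivalent;
    leave non-versioned paths untouched."""
    m = VERSIONED.match(path)
    return "/" + m.group(1) if m else path

# The equivalent mod_rewrite rule in an .htaccess file would be roughly
# (untested sketch, assuming the same layout):
#   RewriteRule ^versions/[^/]+/(.*)$ /$1 [R=302,L]
```

A 302 keeps search engines pointed at the canonical page while the rules are
still being tuned; switch to 301 once they're stable.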

Here's the proposal and my notes:

https://cwiki.apache.org/confluence/display/MXNET/Version+calls+and+redirects+for+Tutorials+and+API+documents

I appreciate your help with this!

Cheers,
Aaron





Re: multiple installation guides?

2018-09-20 Thread Marco de Abreu
https://issues.apache.org/jira/browse/INFRA-17045

Marco de Abreu  wrote on Thu., Sep. 20,
2018, 17:28:

> I'd add a paragraph requesting a small root-cause investigation into why
> the web server diverged from the repository, if you don't mind:
>
> ~~~
> Apache Infra,
> Please remove the /test/ directory found at
> https://mxnet.incubator.apache.org/test
> This directory is not found in the
> https://github.com/apache/incubator-mxnet-site repository, yet it has web
> traffic going to this neglected folder.
> Are there any other folders on the web server that might also fit in this
> category? If so, please advise and remove if they appear to be similar
> "test" folders.
>
> We would appreciate it if you could help us root-cause this problem to
> avoid further similar incidents.
> ~~~
>
> I have forwarded the email to Sebastian.
>
> Best regards,
> Marco
>
> On Thu, Sep 20, 2018 at 4:42 PM Aaron Markham 
> wrote:
>
>> Sure...
>> ~~
>> Apache Infra,
>> Please remove the /test/ directory found at
>> https://mxnet.incubator.apache.org/test
>> This directory is not found in the
>> https://github.com/apache/incubator-mxnet-site repository, yet it has web
>> traffic going to this neglected folder.
>> Are there any other folders on the web server that might also fit in this
>> category? If so, please advise and remove if they appear to be similar
>> "test" folders.
>>
>> Thanks,
>> Aaron
>>
>> On Thu, Sep 20, 2018 at 6:00 AM Marco de Abreu
>>  wrote:
>>
>> > Sure. Aaron, can you write something down that can be copied and pasted
>> into
>> > a ticket so that our mentors just have to create it?
>> >
>> > I think we could ask Infra for an investigation into *why* this
>> > happened, considering that it should be an exact mirror, and for tools
>> > to self-service these kinds of things.
>> >
>> > -Marco
>> >
>> > On Wed, Sep 19, 2018 at 4:11 PM Aaron Markham <
>> aaron.s.mark...@gmail.com>
>> > wrote:
>> >
>> > > It's not on the site repo. It seems like it is only on the Apache
>> > > infra. Can someone request its removal?
>> > >
>> > > On Tue, Sep 18, 2018, 20:34 Hagay Lupesko  wrote:
>> > >
>> > > > The /test site seems to be something old that should have been
>> > > > removed a long time ago; it lists versions 0.10 and 0.10.14 :)
>> > > > Maybe Aaron has an idea what needs to be done to remove it...
>> > > >
>> > > > On Fri, Sep 14, 2018 at 4:55 PM Alex Zai  wrote:
>> > > >
>> > > > > Why do we have two sets of installation guides?
>> > > > >
>> > > > > http://mxnet.incubator.apache.org/test/get_started/install.html
>> > > > >
>> > > > >
>> > > >
>> > >
>> >
>> https://mxnet.incubator.apache.org/install/index.html?platform=Linux=Python=CPU
>> > > > >
>> > > > > The /test domain is also not secure. If this is not supposed to be
>> > > > > public, we should remove it, as it is confusing.
>> > > > >
>> > > >
>> > >
>> >
>>
>


Re: Remove MKLML as dependency

2018-09-20 Thread Chris Olivier
I fixed this issue by adding linkage to OpenBLAS in the CMake build. It was
already being done in the Makefile, so there's no GitHub issue.

On Thu, Sep 20, 2018 at 10:01 AM Lv, Tao A  wrote:

> " MKLML does not have a complete blas library and if you don’t link in
> another blas library like open blas, some functions will blow up (ie some
> of the linalg functions)."
> - Is there any GitHub issue for this problem? Maybe we can take a look.
>
> "I was not aware of MKLML still being required with MKLDNN."
> - Just to clarify, MKL-DNN doesn't require MKLML. For performance, MKL-DNN
> requires the GEMM functions which can be provided by both MKL and MKLML.
>
> -Original Message-
> From: Chris Olivier [mailto:cjolivie...@gmail.com]
> Sent: Friday, September 21, 2018 12:07 AM
> To: dev@mxnet.incubator.apache.org
> Subject: Re: Remove MKLML as dependency
>
> MKLML does not have a complete blas library and if you don’t link in
> another blas library like open blas, some functions will blow up (ie some
> of the linalg functions).
>
> I was not aware of MKLML still being required with MKLDNN. I’ve never
> gotten a definitive answer about this from Da, although I’ve asked a couple
> of times.
>
> What does Da say about all of this?
>
> Unless there’s good reason to the contrary, removing MKLML and requiring
> the larger, strangely licensed standalone MKL for everyone seems a bit
> heavy-handed.
>
> On Thu, Sep 20, 2018 at 7:41 AM Lv, Tao A  wrote:
>
> > Hah, seems it's a little confusing here. I think the "Intel MKL" in
> > the first statement includes both the full MKL and MKLML library. And
> > the "dynamic library" there obviously means the MKLML which is
> > delivered in MKL-DNN repo.
> >
> > MKLML is a subset of full MKL and includes all BLAS functions for both
> > single precision and double precision. From this point of view, I
> > think it can be used as a BLAS library, but cannot be used as full MKL.
> >
> > -tao
> >
> > -Original Message-
> > From: Chris Olivier [mailto:cjolivie...@gmail.com]
> > Sent: Thursday, September 20, 2018 9:36 PM
> > To: dev@mxnet.incubator.apache.org
> > Subject: Re: Remove MKLML as dependency
> >
> > thanks for the info. I am still a little confused — your statement
> > said “MKL” and not “MKLML”, so my question is still the same.  Are
> > GEMMS in MKLML or just MKL? I know MKLML doesn’t have a blas library
> > like the main MKL.
> >
> > On Wed, Sep 19, 2018 at 11:49 PM Lv, Tao A  wrote:
> >
> > > Hi Chris, please kindly check the statements here:
> > > https://github.com/intel/mkl-dnn#installation
> > >
> > > " Intel MKL-DNN can take advantage of optimized matrix-matrix
> > > multiplication (GEMM) function from Intel MKL. The dynamic library
> > > with this functionality is included in the repository. "
> > >
> > > " You can choose to build Intel MKL-DNN without binary dependency.
> > > The resulting version will be fully functional, however performance
> > > of certain convolution shapes and sizes and inner product relying on
> > > SGEMM function may be suboptimal."
> > >
> > > -tao
> > >
> > > -Original Message-
> > > From: Chris Olivier [mailto:cjolivie...@gmail.com]
> > > Sent: Thursday, September 20, 2018 11:20 AM
> > > To: dev@mxnet.incubator.apache.org
> > > Subject: Re: Remove MKLML as dependency
> > >
> > > maybe I missed it, but what does MKLML have that mkldnn doesn’t have
> > > that makes it necessary?
> > >
> > > what’s the motivation for removing it?
> > >
> > > On Tue, Sep 18, 2018 at 11:31 PM Lv, Tao A  wrote:
> > >
> > > > If you just want to test the performance, I think you need to link
> > > > MKL for BLAS and MKL-DNN for NN. Also, MKL-DNN should link MKL for
> > > > better performance.
> > > >
> > > > Here are some ways for you to install the full MKL library if you
> > > > don't have one:
> > > > 1. Register and download from intel website:
> > > > https://software.intel.com/en-us/mkl
> > > > 2. Apt-get/yum: currently this requires configuring Intel’s repositories.
> > > > a. https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-yum-repo
> > > > b. https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-apt-repo
> > > > 3. pip install mkl / mkl-devel: the ‘mkl’ package has the runtime and
> > > > ‘mkl-devel’ includes everything with the headers
> > > > a. https://software.intel.com/en-us/articles/installing-the-intel-distribution-for-python-and-intel-performance-libraries-with-pip-and
> > > > 4. conda install: also has mkl and mkl-devel
> > > > a. https://anaconda.org/intel/mkl
> > > > b. https://anaconda.org/intel/mkl-devel
> > > >
> > > > If you want to redistribute MKL with MXNet, you may need to take care
> > > > of the license issue. Currently, MKL 

Re: Remove MKLML as dependency

2018-09-20 Thread Naveen Swamy
If MKLDNN is a replacement for MKL and MKLML (which is my understanding),
maybe you guys should bring the necessary functions into MKLDNN instead of
letting the users go through this nightmare of a setup.

On Thu, Sep 20, 2018 at 6:01 PM, Lv, Tao A  wrote:

> " MKLML does not have a complete blas library and if you don’t link in
> another blas library like open blas, some functions will blow up (ie some
> of the linalg functions)."
> - Is there any GitHub issue for this problem? Maybe we can take a look.
>
> "I was not aware of MKLML still being required with MKLDNN."
> - Just to clarify, MKL-DNN doesn't require MKLML. For performance, MKL-DNN
> requires the GEMM functions which can be provided by both MKL and MKLML.
>
> -Original Message-
> From: Chris Olivier [mailto:cjolivie...@gmail.com]
> Sent: Friday, September 21, 2018 12:07 AM
> To: dev@mxnet.incubator.apache.org
> Subject: Re: Remove MKLML as dependency
>
> MKLML does not have a complete blas library and if you don’t link in
> another blas library like open blas, some functions will blow up (ie some
> of the linalg functions).
>
> I was not aware of MKLML still being required with MKLDNN. I’ve never
> gotten a definitive answer about this from Da, although I’ve asked a couple
> of times.
>
> What does Da say about all of this?
>
> Unless there’s good reason to the contrary, removing MKLML and requiring
> the larger, strangely licensed standalone MKL for everyone seems a bit
> heavy-handed.
>
> On Thu, Sep 20, 2018 at 7:41 AM Lv, Tao A  wrote:
>
> > Hah, seems it's a little confusing here. I think the "Intel MKL" in
> > the first statement includes both the full MKL and MKLML library. And
> > the "dynamic library" there obviously means the MKLML which is
> > delivered in MKL-DNN repo.
> >
> > MKLML is a subset of full MKL and includes all BLAS functions for both
> > single precision and double precision. From this point of view, I
> > think it can be used as a BLAS library, but cannot be used as full MKL.
> >
> > -tao
> >
> > -Original Message-
> > From: Chris Olivier [mailto:cjolivie...@gmail.com]
> > Sent: Thursday, September 20, 2018 9:36 PM
> > To: dev@mxnet.incubator.apache.org
> > Subject: Re: Remove MKLML as dependency
> >
> > thanks for the info. I am still a little confused — your statement
> > said “MKL” and not “MKLML”, so my question is still the same.  Are
> > GEMMS in MKLML or just MKL? I know MKLML doesn’t have a blas library
> > like the main MKL.
> >
> > On Wed, Sep 19, 2018 at 11:49 PM Lv, Tao A  wrote:
> >
> > > Hi Chris, please kindly check the statements here:
> > > https://github.com/intel/mkl-dnn#installation
> > >
> > > " Intel MKL-DNN can take advantage of optimized matrix-matrix
> > > multiplication (GEMM) function from Intel MKL. The dynamic library
> > > with this functionality is included in the repository. "
> > >
> > > " You can choose to build Intel MKL-DNN without binary dependency.
> > > The resulting version will be fully functional, however performance
> > > of certain convolution shapes and sizes and inner product relying on
> > > SGEMM function may be suboptimal."
> > >
> > > -tao
> > >
> > > -Original Message-
> > > From: Chris Olivier [mailto:cjolivie...@gmail.com]
> > > Sent: Thursday, September 20, 2018 11:20 AM
> > > To: dev@mxnet.incubator.apache.org
> > > Subject: Re: Remove MKLML as dependency
> > >
> > > maybe I missed it, but what does MKLML have that mkldnn doesn’t have
> > > that makes it necessary?
> > >
> > > what’s the motivation for removing it?
> > >
> > > On Tue, Sep 18, 2018 at 11:31 PM Lv, Tao A  wrote:
> > >
> > > > If you just want to test the performance, I think you need to link
> > > > MKL for BLAS and MKL-DNN for NN. Also, MKL-DNN should link MKL for
> > > > better performance.
> > > >
> > > > Here are some ways for you to install the full MKL library if you
> > > > don't have one:
> > > > 1. Register and download from intel website:
> > > > https://software.intel.com/en-us/mkl
> > > > 2. Apt-get/yum: currently this requires configuring Intel’s repositories.
> > > > a. https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-yum-repo
> > > > b. https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-apt-repo
> > > > 3. pip install mkl / mkl-devel: the ‘mkl’ package has the runtime and
> > > > ‘mkl-devel’ includes everything with the headers
> > > > a. https://software.intel.com/en-us/articles/installing-the-intel-distribution-for-python-and-intel-performance-libraries-with-pip-and
> > > > 4. conda install: also has mkl and mkl-devel
> > > > a. https://anaconda.org/intel/mkl
> > > > b. https://anaconda.org/intel/mkl-devel
> > > >
> > > > If you want to redistribute MKL with 

RE: Remove MKLML as dependency

2018-09-20 Thread Lv, Tao A
" MKLML does not have a complete blas library and if you don’t link in another 
blas library like open blas, some functions will blow up (ie some of the linalg 
functions)."
- Is there any GitHub issue for this problem? Maybe we can take a look.

"I was not aware of MKLML still being required with MKLDNN."
- Just to clarify, MKL-DNN doesn't require MKLML. For performance, MKL-DNN 
requires the GEMM functions which can be provided by both MKL and MKLML.

-Original Message-
From: Chris Olivier [mailto:cjolivie...@gmail.com] 
Sent: Friday, September 21, 2018 12:07 AM
To: dev@mxnet.incubator.apache.org
Subject: Re: Remove MKLML as dependency

MKLML does not have a complete blas library and if you don’t link in another 
blas library like open blas, some functions will blow up (ie some of the linalg 
functions).

I was not aware of MKLML still being required with MKLDNN. I’ve never gotten a 
definitive answer about this from Da, although I’ve asked a couple of times.

What does Da say about all of this?

Unless there’s good reason to the contrary, removing MKLML and requiring the 
larger, strangely licensed standalone MKL for everyone seems a bit heavy-handed.

On Thu, Sep 20, 2018 at 7:41 AM Lv, Tao A  wrote:

> Hah, seems it's a little confusing here. I think the "Intel MKL" in 
> the first statement includes both the full MKL and MKLML library. And 
> the "dynamic library" there obviously means the MKLML which is 
> delivered in MKL-DNN repo.
>
> MKLML is a subset of full MKL and includes all BLAS functions for both 
> single precision and double precision. From this point of view, I 
> think it can be used as a BLAS library, but cannot be used as full MKL.
>
> -tao
>
> -Original Message-
> From: Chris Olivier [mailto:cjolivie...@gmail.com]
> Sent: Thursday, September 20, 2018 9:36 PM
> To: dev@mxnet.incubator.apache.org
> Subject: Re: Remove MKLML as dependency
>
> thanks for the info. I am still a little confused — your statement 
> said “MKL” and not “MKLML”, so my question is still the same.  Are 
> GEMMS in MKLML or just MKL? I know MKLML doesn’t have a blas library 
> like the main MKL.
>
> On Wed, Sep 19, 2018 at 11:49 PM Lv, Tao A  wrote:
>
> > Hi Chris, please kindly check the statements here:
> > https://github.com/intel/mkl-dnn#installation
> >
> > " Intel MKL-DNN can take advantage of optimized matrix-matrix 
> > multiplication (GEMM) function from Intel MKL. The dynamic library 
> > with this functionality is included in the repository. "
> >
> > " You can choose to build Intel MKL-DNN without binary dependency. 
> > The resulting version will be fully functional, however performance 
> > of certain convolution shapes and sizes and inner product relying on 
> > SGEMM function may be suboptimal."
> >
> > -tao
> >
> > -Original Message-
> > From: Chris Olivier [mailto:cjolivie...@gmail.com]
> > Sent: Thursday, September 20, 2018 11:20 AM
> > To: dev@mxnet.incubator.apache.org
> > Subject: Re: Remove MKLML as dependency
> >
> > maybe I missed it, but what does MKLML have that mkldnn doesn’t have 
> > that makes it necessary?
> >
> > what’s the motivation for removing it?
> >
> > On Tue, Sep 18, 2018 at 11:31 PM Lv, Tao A  wrote:
> >
> > > If you just want to test the performance, I think you need to link
> > > MKL for BLAS and MKL-DNN for NN. Also, MKL-DNN should link MKL for
> > > better performance.
> > >
> > > Here are some ways for you to install the full MKL library if you
> > > don't have one:
> > > 1. Register and download from intel website:
> > > https://software.intel.com/en-us/mkl
> > > 2. Apt-get/yum: currently this requires configuring Intel’s repositories.
> > > a. https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-yum-repo
> > > b. https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-apt-repo
> > > 3. pip install mkl / mkl-devel: the ‘mkl’ package has the runtime and
> > > ‘mkl-devel’ includes everything with the headers
> > > a. https://software.intel.com/en-us/articles/installing-the-intel-distribution-for-python-and-intel-performance-libraries-with-pip-and
> > > 4. conda install: also has mkl and mkl-devel
> > > a. https://anaconda.org/intel/mkl
> > > b. https://anaconda.org/intel/mkl-devel
> > >
> > > If you want to redistribute MKL with MXNet, you may need to take care
> > > of the license issue. Currently, MKL is using ISSL ( 
> > > https://software.intel.com/en-us/license/intel-simplified-software-license
> > > ).
> > >
> > > -Original Message-
> > > From: Zai, Alexander [mailto:alex...@amazon.com.INVALID]
> > > Sent: Wednesday, September 19, 2018 12:49 PM
> > > To: dev@mxnet.incubator.apache.org
> > > Subject: Re: Remove MKLML as dependency
> > >
> > > 

Some feedback from MXNet Zhihu topic

2018-09-20 Thread Timur Shenkao
There are:
Gluon API
Module API
Some other APIs in MXNet
Low-level C / C++ APIs

Recently I accidentally discovered that such things as GluonNLP and
GluonCV exist (besides some examples in MXNet itself).
It's unclear whether I can rely on some API or whether I have to write my
own C / C++ code.

I implement publicly available articles and some other ideas in TF all the
time. But when it comes to MXNet, I am often reluctant because it's
difficult to understand which way to go. It's unclear whether my efforts
will result in a working model or whether I will get stuck.
Points #5 and #6 are absolutely true.
As for documentation, all projects in the turbulent phase of their
lifecycle have outdated docs; that's normal. I'd say the docs are very good
(I remember the early Spark & DL4J docs).



On Thursday, September 20, 2018, Tianqi Chen 
wrote:

> The key complaint here is mainly about the clarity of the documents
> themselves. Maybe it is time to focus on a single flavor of API that is
> useful (Gluon) and highlight all the docs around that.
>
> Tianqi
>
>
> On Wed, Sep 19, 2018 at 11:04 AM Qing Lan  wrote:
>
> > Hi all,
> >
> > There was a trending topic on
> > Zhihu (a famous Chinese Stack Overflow + Quora) recently, asking about the
> > status of MXNet in 2018. Mu replied to the thread and received more than
> > 300 `like`s.
> > However, there are a few concerns raised in the comments of this thread;
> > I have done some simple translation from Chinese to English:
> >
> > 1. Documentation! Until now, the online doc still contains:
> > 1. Deprecated but not updated docs
> > 2. Wrong documentation with poor descriptions
> > 3. Documents in alpha stage, such as ones where you must
> > install with `pip --pre` in order to run.
> >
> > 2. Examples! For Gluon specifically, many examples are still mixing
> > Gluon/MXNet APIs. The mixture of mx.sym, mx.nd, and mx.gluon confuses
> > users about which is the right one to choose to get their model to work.
> > As an example, although Gluon made data encapsulation possible, there
> > are still examples using mx.io.ImageRecordIter with tens of params (it
> > feels like Gluon examples are simply copied from the old Python examples).
> >
> > 3. Examples again! Compared to PyTorch, there are a few examples I don't
> > like in Gluon:
> > 1. Runnable, but the code structure is still very
> > complicated, such as example/image-classification/cifar10.py. It reads
> > like consecutive code concatenation. In fact, these are just a series of
> > layers mixed with model.fit, which makes it very hard for users to
> > modify/extend the model.
> > 2. Only runnable with certain settings. If users try to
> > change the model a little bit, crashes happen. For example, in the
> > multi-GPU example on the Gluon website, MXNet hides the logic of using
> > the batch size to scale the learning rate inside the optimizer. A lot of
> > newbies didn't know this fact and would only find that the model stopped
> > converging when the batch size changed.
> > 3. The worst scenario is that the model itself simply
> > didn't work. Maintainers in the MXNet community didn't run the model
> > (with not even an integration test) and merged the code directly. This
> > leaves the script unable to run until somebody raises the issue and
> > fixes it.
> >
> > 4. The community problem. The core advantage of MXNet is its scalability
> > and efficiency. However, the documentation of some tools is confusing.
> > Here are two examples:
> >
> > 1. im2rec comes in 2 versions, C++ (binary) and Python.
> > But nobody would have thought that the argparse options in these tools
> > are different (in the meantime, there are no appropriate examples to
> > compare with; users can only use them by guessing the usage).
> >
> > 2. How do we combine MXNet's distributed platform with a
> > supercomputing tool such as Slurm? How do we do profiling and debugging?
> > A couple of companies I knew thought of using MXNet for distributed
> > training. Due to the lack of examples and poor support from the
> > community, they had to move their models to TensorFlow and Horovod.
> >
> > 5. The heavy code base. Most of the MXNet examples/source
> > code/documentation/language bindings are in a single repo. A git clone
> > operation costs tens of MB. New-feature PRs take longer than expected.
> > The poor review responsiveness / rules keep new contributors away from
> > the community. I remember there was a call for document improvement last
> > year; the total timeline cost a user 3 months to merge into master,
> > almost a full release interval of PyTorch.
> >
> > 6. To developers. There are very few people in the community who have
> > discussed the improvements we can make to make MXNet more user-friendly.
> > It's been so easy to trigger tens of stack issues during coding. Again,
> > is that a
> 

RE: Remove MKLML as dependency

2018-09-20 Thread Lv, Tao A
The name mxnet-mkl was used for the MKL2017 integration before it was replaced
by the MKL-DNN integration in the 1.2.0 release. To provide a consistent
experience to users, we reused this name to deliver the MXNet + MKL-DNN pip
package.

Actually, MKL-DNN doesn't fulfill the BLAS requirements. MKL-DNN itself doesn't
have any BLAS functionality. We still need a BLAS library (currently, in the
Makefile, one of OpenBLAS/ATLAS/MKL) to build MXNet even when MKL-DNN is
enabled.

We talked about building the mxnet-mkl package with USE_BLAS=mkl instead, but
ultimately we ran into MKL's license issue once we wanted to redistribute it
with MXNet.

Hope these can answer your questions.
-tao

-Original Message-
From: Aaron Markham [mailto:aaron.s.mark...@gmail.com] 
Sent: Thursday, September 20, 2018 11:26 PM
To: dev@mxnet.incubator.apache.org
Subject: Re: Remove MKLML as dependency

I find it unintuitive that mxnet-mkl doesn't actually ship with MKL. Why isn't 
it called mxnet-mkldnn instead?

Side note, if mkldnn fulfills BLAS requirements, then why can't we strip out 
OpenBLAS for the "mxnet-mkl" package? Is there no way to make the submodules 
conform to using mkldnn? All in the spirit of simplifying things...limiting the 
deps...

On Sep 20, 2018 07:41, "Lv, Tao A"  wrote:

Hah, seems it's a little confusing here. I think the "Intel MKL" in the first 
statement includes both the full MKL and MKLML library. And the "dynamic 
library" there obviously means the MKLML which is delivered in MKL-DNN repo.

MKLML is a subset of full MKL and includes all BLAS functions for both single 
precision and double precision. From this point of view, I think it can be used 
as a BLAS library, but cannot be used as full MKL.


-tao

-Original Message-
From: Chris Olivier [mailto:cjolivie...@gmail.com]
Sent: Thursday, September 20, 2018 9:36 PM
To: dev@mxnet.incubator.apache.org
Subject: Re: Remove MKLML as dependency

thanks for the info. I am still a little confused — your statement said “MKL” 
and not “MKLML”, so my question is still the same.  Are GEMMS in MKLML or just 
MKL? I know MKLML doesn’t have a blas library like the main MKL.

On Wed, Sep 19, 2018 at 11:49 PM Lv, Tao A  wrote:

> Hi Chris, please kindly check the statements here:
> https://github.com/intel/mkl-dnn#installation
>
> " Intel MKL-DNN can take advantage of optimized matrix-matrix 
> multiplication (GEMM) function from Intel MKL. The dynamic library 
> with this functionality is included in the repository. "
>
> " You can choose to build Intel MKL-DNN without binary dependency. The 
> resulting version will be fully functional, however performance of 
> certain convolution shapes and sizes and inner product relying on 
> SGEMM function may be suboptimal."
>
> -tao
>
> -Original Message-
> From: Chris Olivier [mailto:cjolivie...@gmail.com]
> Sent: Thursday, September 20, 2018 11:20 AM
> To: dev@mxnet.incubator.apache.org
> Subject: Re: Remove MKLML as dependency
>
> maybe I missed it, but what does MKLML have that mkldnn doesn’t have 
> that makes it necessary?
>
> what’s the motivation for removing it?
>
> On Tue, Sep 18, 2018 at 11:31 PM Lv, Tao A  wrote:
>
> > If you just want to test the performance, I think you need to link MKL
> > for BLAS and MKL-DNN for NN. Also MKL-DNN should link MKL for better 
> > performance.
> >
> > Here are some ways for you to install the full MKL library if you don't
> > have one:
> > 1. Register and download from intel website:
> > https://software.intel.com/en-us/mkl
> > 2. Apt-get/yum: currently this requires configuring Intel’s repositories.
> > a. https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-yum-repo
> > b. https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-apt-repo
> > 3. pip install mkl / mkl-devel: the ‘mkl’ package has the runtime and
> > ‘mkl-devel’ includes everything with the headers
> > a. https://software.intel.com/en-us/articles/installing-the-intel-distribution-for-python-and-intel-performance-libraries-with-pip-and

> > 4. conda install: also has mkl and mkl-devel
> > a. https://anaconda.org/intel/mkl
> > b. https://anaconda.org/intel/mkl-devel
> >
> > If you want to redistribute MKL with MXNet, you may need to take care
> > of the license issue. Currently, MKL is using ISSL ( 
> > https://software.intel.com/en-us/license/intel-simplified-software-license
> > ).
> >
> > -Original Message-
> > From: Zai, Alexander [mailto:alex...@amazon.com.INVALID]
> > Sent: Wednesday, September 19, 2018 12:49 PM
> > To: dev@mxnet.incubator.apache.org
> > Subject: Re: Remove MKLML as dependency
> >
> > Will test it out tomorrow.
> >
> > On the side, what is the best way to test MKL build for MXnet. MKL 
> > is licensed?
> >
> > Best,
> > Alex
> >
> > On 9/18/18, 

Re: Remove MKLML as dependency

2018-09-20 Thread Chris Olivier
MKLML does not have a complete blas library and if you don’t link in
another blas library like open blas, some functions will blow up (ie some
of the linalg functions).

I was not aware of MKLML still being required with MKLDNN. I’ve never
gotten a definitive answer about this from Da, although I’ve asked a couple
of times.

What does Da say about all of this?

Unless there’s good reason to the contrary, removing MKLML and requiring
the larger, strangely licensed standalone MKL for everyone seems a bit
heavy-handed.
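
As a loose illustration of the failure mode described above, a build's
BLAS/LAPACK stack can be smoke-tested by exercising both a plain GEMM and a
linalg-style routine. NumPy stands in here for an MXNet build; this is a
sketch, not a definitive diagnostic for MKLML:

```python
import numpy as np

def blas_smoke_test(n=8):
    """Exercise a plain GEMM (BLAS) plus a solve (LAPACK-backed linalg).

    With only a partial BLAS such as MKLML linked and no full BLAS to
    fall back on, it is the linalg-style call that would blow up.
    """
    a = np.random.rand(n, n)
    spd = a @ a.T + n * np.eye(n)         # symmetric positive definite
    x = np.linalg.solve(spd, np.ones(n))  # needs LAPACK-level routines
    return bool(np.allclose(spd @ x, np.ones(n)))
```

On a build with a complete BLAS/LAPACK the check passes; on one with only
the GEMM subset linked, the solve step is where a crash would surface.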

On Thu, Sep 20, 2018 at 7:41 AM Lv, Tao A  wrote:

> Hah, seems it's a little confusing here. I think the "Intel MKL" in the
> first statement includes both the full MKL and MKLML library. And the
> "dynamic library" there obviously means the MKLML which is delivered in
> MKL-DNN repo.
>
> MKLML is a subset of full MKL and includes all BLAS functions for both
> single precision and double precision. From this point of view, I think it
> can be used as a BLAS library, but cannot be used as full MKL.
>
> -tao
>
> -Original Message-
> From: Chris Olivier [mailto:cjolivie...@gmail.com]
> Sent: Thursday, September 20, 2018 9:36 PM
> To: dev@mxnet.incubator.apache.org
> Subject: Re: Remove MKLML as dependency
>
> thanks for the info. I am still a little confused — your statement said
> “MKL” and not “MKLML”, so my question is still the same.  Are GEMMS in
> MKLML or just MKL? I know MKLML doesn’t have a blas library like the main
> MKL.
>
> On Wed, Sep 19, 2018 at 11:49 PM Lv, Tao A  wrote:
>
> > Hi Chris, please kindly check the statements here:
> > https://github.com/intel/mkl-dnn#installation
> >
> > " Intel MKL-DNN can take advantage of optimized matrix-matrix
> > multiplication (GEMM) function from Intel MKL. The dynamic library
> > with this functionality is included in the repository. "
> >
> > " You can choose to build Intel MKL-DNN without binary dependency. The
> > resulting version will be fully functional, however performance of
> > certain convolution shapes and sizes and inner product relying on
> > SGEMM function may be suboptimal."
> >
> > -tao
> >
> > -Original Message-
> > From: Chris Olivier [mailto:cjolivie...@gmail.com]
> > Sent: Thursday, September 20, 2018 11:20 AM
> > To: dev@mxnet.incubator.apache.org
> > Subject: Re: Remove MKLML as dependency
> >
> > maybe I missed it, but what does MKLML have that mkldnn doesn’t have
> > that makes it necessary?
> >
> > what’s the motivation for removing it?
> >
> > On Tue, Sep 18, 2018 at 11:31 PM Lv, Tao A  wrote:
> >
> > > If you just want to test the performance, I think you need to link MKL
> > > for BLAS and MKL-DNN for NN. Also, MKL-DNN should link MKL for better
> > > performance.
> > >
> > > Here are some ways for you to install full MKL library if you don't
> > > have
> > > one:
> > > 1. Register and download from intel website:
> > > https://software.intel.com/en-us/mkl
> > > 2. Apt-get/yum: currently it needs Intel’s repositories to be configured.
> > > a.
> > >
> > https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-yum-repo
> > > b. https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-apt-repo
> > > 3. pip install mkl / mkl-devel: ‘mkl’ package has
> > > the runtime and ‘mkl-devel’ includes everything with the headers
> > > a.
> > > https://software.intel.com/en-us/articles/installing-the-intel-distribution-for-python-and-intel-performance-libraries-with-pip-and
> > > 4. conda install: also has mkl and mkl-devel
> > > a. https://anaconda.org/intel/mkl
> > > b. https://anaconda.org/intel/mkl-devel
> > >
> > > If you want to redistribute MKL with MXNet, you may need take care
> > > of the license issue. Currently, MKL is using ISSL (
> > > https://software.intel.com/en-us/license/intel-simplified-software-license
> > > ).
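For readers who want to try the full-MKL build described above, here is a minimal sketch of the relevant flags. The flag names are assumptions based on MXNet 1.x's Makefile conventions (USE_BLAS, USE_MKLDNN); verify them against your checkout before relying on them:

```python
# Sketch only: the make flags for building MXNet against a full MKL
# install plus MKL-DNN, instead of the bundled MKLML. Flag names assume
# MXNet 1.x's Makefile options -- verify against your checkout.
MKL_BUILD_FLAGS = {
    "USE_BLAS": "mkl",   # use full MKL as the BLAS library
    "USE_MKLDNN": "1",   # build the MKL-DNN operator backend
}

# A typical invocation would then be:
make_cmd = "make -j4 " + " ".join(f"{k}={v}" for k, v in MKL_BUILD_FLAGS.items())
print(make_cmd)  # -> make -j4 USE_BLAS=mkl USE_MKLDNN=1
```

With a licensed full MKL on the path, this is the configuration under which MKLML should be unnecessary.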
> > >
> > > -Original Message-
> > > From: Zai, Alexander [mailto:alex...@amazon.com.INVALID]
> > > Sent: Wednesday, September 19, 2018 12:49 PM
> > > To: dev@mxnet.incubator.apache.org
> > > Subject: Re: Remove MKLML as dependency
> > >
> > > Will test it out tomorrow.
> > >
> > > As an aside, what is the best way to test the MKL build for MXNet,
> > > given that MKL is licensed?
> > >
> > > Best,
> > > Alex
> > >
> > > On 9/18/18, 7:50 PM, "Lv, Tao A"  wrote:
> > >
> > > Hi Alex,
> > >
> > > Thanks for bringing this up.
> > >
> > > The original intention of MKLML is to provide a light and
> > > easy-to-access library for ML/DL community. It's released with
> > > MKL-DNN under Apache-2.0 license.
> > >
> > > AFAIK, MKL-DNN still relies on it for better performance. So I'm
> > > afraid there will be a performance regression in MKL pip packages if
> > > MKLML is simply removed.
> > >
> > > Have you ever tried the build without MKLML, and what does the performance look like?

Re: Some feedback from MXNet Zhihu topic

2018-09-20 Thread Tianqi Chen
The key complaint here is mainly about the clarity of the documents
themselves. Maybe it is time to focus on a single flavor of the API that is
useful (Gluon) and highlight all the docs around that.

Tianqi


On Wed, Sep 19, 2018 at 11:04 AM Qing Lan  wrote:

> Hi all,
>
> There was a trending topic on
> Zhihu (a famous Chinese StackOverflow+Quora) recently asking about the status of
> MXNet in 2018. Mu replied to the thread and received more than 300
> `like`s.
> However, there are a few concerns raised in the comments of this thread;
> I have done some simple translation from Chinese to English:
>
> 1. Documentation! Until now, the online docs still contain:
> 1. Deprecated but never-updated docs
> 2. Wrong documentation with poor descriptions
> 3. Documents stuck in an alpha stage, e.g. you must install with `pip
> --pre` in order to run.
>
> 2. Examples! For Gluon specifically, many examples still mix the
> Gluon and MXNet APIs. The mixture of mx.sym, mx.nd and mx.gluon confuses users
> about which one is the right one to choose to get their model to work. As
> an example, although Gluon made data encapsulation possible, there are still
> examples using mx.io.ImageRecordIter with tens of params (it feels like
> the Gluon examples are simply copies of the old Python examples).
>
> 3. Examples again! Compared to PyTorch, there are a few examples I don't
> like in Gluon:
> 1. They run, but the code structure is still
> very complicated, such as example/image-classification/cifar10.py. It
> reads like one long concatenation of code. In fact, it is just a
> series of layers mixed with model.fit, which makes it very hard for users to
> modify or extend the model.
> 2. Some only run with certain settings. If users
> try to change the model a little, it crashes. For example, in
> the multi-gpu example on the Gluon website, MXNet hides the logic that uses the
> batch size to scale the learning rate inside the optimizer. A lot of newbies don't
> know this, and they only find out that the model stops converging
> when the batch size changes.
> 3. The worst scenario is when the model itself simply
> doesn't work. Maintainers in the MXNet community merged the code directly without
> running the model (not even an integration test), leaving the script
> broken until somebody raised an issue and fixed it.
>
> 4. The community problem. The core advantages of MXNet are its scalability
> and efficiency. However, the documentation of some tools is confusing.
> Here are two examples:
>
> 1. im2rec comes in two versions, C++ (binary) and Python.
> But nobody would guess that the argparse options of these two tools differ (and
> in the meantime, there are no appropriate examples to compare against, so users
> can only use them by guessing).
>
> 2. How do you combine MXNet's distributed platform with a
> supercomputing tool such as Slurm? How do you profile, and how do you debug?
> A couple of companies I know considered using MXNet for distributed
> training. Due to the lack of examples and poor support from the community, they
> had to move their models to TensorFlow and Horovod.
>
> 5. The heavy code base. Most of the MXNet examples/source
> code/documentation/language bindings live in a single repo. A git clone
> alone costs tens of megabytes. New-feature PRs take longer
> than expected, and the slow review responses and strict rules keep new
> contributors away from the community. I remember there was a call for
> document improvement last year; it took a user a total of 3 months
> to get merged into master. That almost equals a release interval of
> PyTorch.
>
> 6. To developers. There are very few people in the community discussing the
> improvements we could make so that MXNet is more user-friendly. It's so easy
> to trigger dozens of C++ stack traces during coding. Again, is it a requirement
> for MXNet users to be familiar with C++? The connection between Python and
> C lacks IDE lint support (maybe MXNet assumes every developer is a Vim master).
> APIs and underlying implementations change frequently, so people have to release
> their code pinned to an archived version of MXNet (as TuSimple and MSRA do).
> Look at PyTorch: even an API to move a tensor to a device raises
> a thorough discussion.
>
> There will be more comments translated to English and I will keep this
> thread updated…
> Thanks,
> Qing
>
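The hidden batch-size/learning-rate coupling described in point 3.2 above can be illustrated in plain NumPy. This is a sketch of the arithmetic only; in MXNet the mechanism is the optimizer's rescale_grad parameter, commonly set to 1/batch_size:

```python
import numpy as np

# Per-example gradients for one weight over a batch of 4 examples.
grads = np.array([0.2, 0.4, 0.1, 0.3])
w, lr = 1.0, 0.1
batch_size = len(grads)

# Update using the mean gradient:
w_mean = w - lr * grads.mean()

# Update using the *summed* gradient rescaled by 1/batch_size --
# equivalent to the mean-gradient update. If the batch size changes
# but the rescale factor is not updated to match, the effective
# learning rate silently changes by the same factor, which is exactly
# how a model can "stop converging" after a batch-size change.
w_sum = w - lr * (1.0 / batch_size) * grads.sum()

assert abs(w_mean - w_sum) < 1e-12
```

Making this rescaling explicit in the multi-GPU examples would spare newcomers the silent failure mode described above.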


Re: Some feedback from MXNet Zhihu topic

2018-09-20 Thread Tianqi Chen
Thanks for the great feedback. I want to point out, though, that the cost of
building MXNet is mainly in the operators that sit in the MXNet repo, rather
than in its submodules.

Tianqi


On Thu, Sep 20, 2018 at 1:10 AM Naveen Swamy  wrote:

> Qing,
>
> this thread is loaded with very specific suggestions. Thank you for bringing
> them up here. Since Apache MXNet is popular in China, it would be great if
> Mandarin-speaking developers here could bring such feedback and user pain
> points to the community's attention.
>
> 1. To capture specific API/Example/Tutorial that users have an issue on, Mu
> suggested in the past to add thumbs up/down on the website:
> https://issues.apache.org/jira/browse/MXNET-972
>
> 6. The heavy code base is not because of the code in the MXNet repo; it's
> all the sub-modules that are added to the repo - I have had this problem
> too: to build MXNet I have to fetch and build the whole world that MXNet
> depends on, plus its dependencies (subs within subs) - I think it's time to
> revisit and refactor.
>
> For the others, I suggest you work with someone to create actionable JIRAs
> (maybe Denis - because he is knowledgeable about JIRA and creates nice
> actionable stories). It would be nice if these stories could contain many
> first-good-issue tasks for new contributors to pick up - creating
> standalone examples (from existing ones) is a great way for newbies to learn
> MXNet and contribute back.
>
> Examples are very important for someone to not only quickly learn but also
> extend/adapt to their own application. In Scala we (you) have added tests
> around the examples and actually use them as integration tests - we should
> insist on the same for new examples written or old examples that we touch.
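The "examples as integration tests" idea can be sketched in Python as well. The helper name and the pytest convention below are illustrative assumptions, not MXNet's actual CI setup:

```python
import subprocess
import sys

def run_example(script_path, timeout=600):
    """Run an example script in a subprocess; fail loudly if it breaks.

    Dropping calls to this into a pytest module turns every listed
    example into an integration test that fails CI when the example
    stops working -- instead of waiting for a user to file an issue.
    """
    result = subprocess.run(
        [sys.executable, script_path],
        capture_output=True,
        timeout=timeout,
    )
    assert result.returncode == 0, result.stderr.decode()

# Hypothetical usage in a test module:
#   def test_cifar10_example():
#       run_example("example/image-classification/cifar10.py")
```

The main design choice is the timeout: examples that train for hours need a reduced-epoch or dry-run mode to be testable.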
>
> In deep learning, what is more critical and could increase rapid adoption is
> to have the latest and greatest papers implemented as examples - this is a
> call for suggestions and action to the community.
>
> Thanks, Naveen
>
>
> On Wed, Sep 19, 2018 at 10:39 PM, Aaron Markham  >
> wrote:
>
> > Thanks for this translation and feedback Qing!
> > I've addressed point 3 of the documentation feedback with this PR:
> > https://github.com/apache/incubator-mxnet/pull/12604
> > I'm not sure how to take the first two points without some explicit URLs
> > and examples, so if anyone has those I'd be happy to take a look if
> there's
> > some glitch vs missing or wrong docs.
> >
> > Also, I would agree that there should be some more simple examples.
> > Oftentimes the examples are too complicated and unclear about what is
> important
> > or not. The audience targeted is deep learning practitioners, not
> > "newbies".
> >
> > And on a related note, I'd really like to pull the Gluon stuff into the
> API
> > section. It's confusing as its own navigation item and orphaned
> > information. It could have a navigation entry at the top of the API list
> > like "Python: Gluon" or just "Gluon" then list "Python: Module" or just
> > "Python". Or running this the other way, the Gluon menu could have API
> and
> > Tutorials and be more fleshed out, though this is not my preference.
> Either
> > way, it needs some attention.
> >
> > Cheers,
> > Aaron
> >
> > On Wed, Sep 19, 2018 at 11:04 AM Qing Lan  wrote:
> >
> > > Hi all,
> > >
> > > There was a trend topic in
> > > Zhihu (a famous Chinese Stackoverflow+Quora) asking about the status of
> > > MXNet in 2018 recently. Mu replied the thread and obtained more than
> 300+
> > > `like`.
> > > However there are a few concerns addressed in the comments of this
> > thread,
> > > I have done some simple translation from Chinese to English:
> > >
> > > 1. Documentation! Until now, the online doc still contains:
> > > 1. Deprecated but not updated doc
> > > 2. Wrong documentation with poor description
> > > 3. Document in Alpha stage such as you must install
> `pip
> > > --pre` in order to run.
> > >
> > > 2. Examples! For Gluon specifically, many examples are still mixing
> > > Gluon/MXNet APIs. The mixture of mx.sym, mx.nd and mx.gluon confused the
> users
> > > of what is the right one to choose in order to get their model to work.
> > As
> > > an example, Although Gluon made data encapsulation possible, still
> there
> > > are examples using mx.io.ImageRecordIter with tens of params (feels
> like
> > > gluon examples are simply the copy from old Python examples).
> > >
> > > 3. Examples again! Comparing to PyTorch, there are a few examples I
> don't
> > > like in Gluon:
> > > 1. Available to run however the code structure is still
> > > very complicated. Such as example/image-classification/cifar10.py. It
> > > seemed like a consecutive code concatenation. In fact, these are just a
> > > series of layers mixed with model.fit. It makes user very hard to
> > > modify/extend the model.
> > > 2. Only available to run with certain settings. If
> users
> > > try to change a little bit in the model, 

Re: multiple installation guides?

2018-09-20 Thread Marco de Abreu
I'd add a paragraph requesting a small root-cause investigation into why
the web server diverged from the repository, if you don't mind:


Apache Infra,
Please remove the /test/ directory found at
https://mxnet.incubator.apache.org/test
This directory is not found in the
https://github.com/apache/incubator-mxnet-site repository, yet it has web
traffic going to this neglected folder.
Are there any other folders on the web server that might also fit in this
category? If so, please advise and remove if they appear to be similar
"test" folders.

We would appreciate it if you could help us root-cause this problem to
avoid further similar incidents.
~~~

I have forwarded the email to Sebastian.

Best regards,
Marco

On Thu, Sep 20, 2018 at 4:42 PM Aaron Markham 
wrote:

> Sure...
> ~~
> Apache Infra,
> Please remove the /test/ directory found at
> https://mxnet.incubator.apache.org/test
> This directory is not found in the
> https://github.com/apache/incubator-mxnet-site repository, yet it has web
> traffic going to this neglected folder.
> Are there any other folders on the web server that might also fit in this
> category? If so, please advise and remove if they appear to be similar
> "test" folders.
>
> Thanks,
> Aaron
>
> On Thu, Sep 20, 2018 at 6:00 AM Marco de Abreu
>  wrote:
>
> > Sure. Aaron, can you write something down that can be copy and pasted
> into
> > a ticket so that our mentors just have to create it?
> >
> > I think we could ask Infra for an investigation about *why* this
> happened,
> > considering the fact that it should be an exact mirror, and for tools to
> > self-service these kinds of things.
> >
> > -Marco
> >
> > On Wed, Sep 19, 2018 at 4:11 PM Aaron Markham  >
> > wrote:
> >
> > > It's not on the site repo. Seems like it is only on the Apache infra.
> Can
> > > someone request it's removal?
> > >
> > > On Tue, Sep 18, 2018, 20:34 Hagay Lupesko  wrote:
> > >
> > > > The /test site seems to be something old that should have been
> removed
> > a
> > > > long time ago, it lists versions 0.10 and 0.10.14 :)
> > > > Maybe Aaron has an idea what needs to be done to remove it...
> > > >
> > > > On Fri, Sep 14, 2018 at 4:55 PM Alex Zai  wrote:
> > > >
> > > > > Why do we have two sets of installation guides?
> > > > >
> > > > > http://mxnet.incubator.apache.org/test/get_started/install.html
> > > > >
> > > > >
> > > >
> > >
> >
> https://mxnet.incubator.apache.org/install/index.html?platform=Linux&language=Python&processor=CPU
> > > > >
> > > > > The /test domain is also not secure. If this is not supposed to be
> > > > > public, we should remove it, as it is confusing.
> > > > >
> > > >
> > >
> >
>


Re: Remove MKLML as dependency

2018-09-20 Thread Aaron Markham
I find it unintuitive that mxnet-mkl doesn't actually ship with MKL. Why
isn't it called mxnet-mkldnn instead?

Side note, if mkldnn fulfills BLAS requirements, then why can't we strip
out OpenBLAS for the "mxnet-mkl" package? Is there no way to make the
submodules conform to using mkldnn? All in the spirit of simplifying
things...limiting the deps...

On Sep 20, 2018 07:41, "Lv, Tao A"  wrote:

Hah, seems it's a little confusing here. I think the "Intel MKL" in the
first statement includes both the full MKL and MKLML library. And the
"dynamic library" there obviously means the MKLML which is delivered in
MKL-DNN repo.

MKLML is a subset of full MKL and includes all BLAS functions for both
single precision and double precision. From this point of view, I think it
can be used as a BLAS library, but cannot be used as full MKL.


-tao

-Original Message-
From: Chris Olivier [mailto:cjolivie...@gmail.com]
Sent: Thursday, September 20, 2018 9:36 PM
To: dev@mxnet.incubator.apache.org
Subject: Re: Remove MKLML as dependency

thanks for the info. I am still a little confused — your statement said
“MKL” and not “MKLML”, so my question is still the same. Are GEMMs in
MKLML or just MKL? I know MKLML doesn’t have a BLAS library like the main
MKL.

On Wed, Sep 19, 2018 at 11:49 PM Lv, Tao A  wrote:

> Hi Chris, please kindly check the statements here:
> https://github.com/intel/mkl-dnn#installation
>
> " Intel MKL-DNN can take advantage of optimized matrix-matrix
> multiplication (GEMM) function from Intel MKL. The dynamic library
> with this functionality is included in the repository. "
>
> " You can choose to build Intel MKL-DNN without binary dependency. The
> resulting version will be fully functional, however performance of
> certain convolution shapes and sizes and inner product relying on
> SGEMM function may be suboptimal."
>
> -tao
>
> -Original Message-
> From: Chris Olivier [mailto:cjolivie...@gmail.com]
> Sent: Thursday, September 20, 2018 11:20 AM
> To: dev@mxnet.incubator.apache.org
> Subject: Re: Remove MKLML as dependency
>
> maybe I missed it, but what does MKLML have that mkldnn doesn’t have
> that makes it necessary?
>
> what’s the motivation for removing it?
>
> On Tue, Sep 18, 2018 at 11:31 PM Lv, Tao A  wrote:
>
> > If you just want to test the performance, I think you need to link MKL
> > for BLAS and MKL-DNN for NN. Also MKL-DNN should link MKL for better
> > performance.
> >
> > Here are some ways for you to install full MKL library if you don't
> > have
> > one:
> > 1. Register and download from intel website:
> > https://software.intel.com/en-us/mkl
> > 2. Apt-get/yum: currently it needs Intel’s repositories to be configured.
> > a.
> >
> https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-yum-repo
> > b. https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-apt-repo
> > 3. pip install mkl / mkl-devel: ‘mkl’ package has
> > the runtime and ‘mkl-devel’ includes everything with the headers
> > a.
> > https://software.intel.com/en-us/articles/installing-the-intel-distribution-for-python-and-intel-performance-libraries-with-pip-and

> > 4. conda install: also has mkl and mkl-devel
> > a. https://anaconda.org/intel/mkl
> > b. https://anaconda.org/intel/mkl-devel
> >
> > If you want to redistribute MKL with MXNet, you may need take care
> > of the license issue. Currently, MKL is using ISSL (
> > https://software.intel.com/en-us/license/intel-simplified-software-license
> > ).
> >
> > -Original Message-
> > From: Zai, Alexander [mailto:alex...@amazon.com.INVALID]
> > Sent: Wednesday, September 19, 2018 12:49 PM
> > To: dev@mxnet.incubator.apache.org
> > Subject: Re: Remove MKLML as dependency
> >
> > Will test it out tomorrow.
> >
> > As an aside, what is the best way to test the MKL build for MXNet, given
> > that MKL is licensed?
> >
> > Best,
> > Alex
> >
> > On 9/18/18, 7:50 PM, "Lv, Tao A"  wrote:
> >
> > Hi Alex,
> >
> > Thanks for bringing this up.
> >
> > The original intention of MKLML is to provide a light and
> > easy-to-access library for ML/DL community. It's released with
> > MKL-DNN under Apache-2.0 license.
> >
> > AFAIK, MKL-DNN still relies on it for better performance. So I'm
> > afraid there will be a performance regression in MKL pip packages if
> > MKLML is simply removed.
> >
> > Have you ever tried the build without MKLML, and what does the
> > performance look like?
> >
> > -tao
> >
> > -Original Message-
> > From: Alex Zai [mailto:aza...@gmail.com]
> > Sent: Wednesday, September 19, 2018 4:49 AM
> > To: dev@mxnet.incubator.apache.org
> > Subject: Remove MKLML as dependency
> >
> > On our build from source page we have a list of blas libraries
> > that are recommended:
> >
> 

Re: multiple installation guides?

2018-09-20 Thread Aaron Markham
Sure...
~~
Apache Infra,
Please remove the /test/ directory found at
https://mxnet.incubator.apache.org/test
This directory is not found in the
https://github.com/apache/incubator-mxnet-site repository, yet it has web
traffic going to this neglected folder.
Are there any other folders on the web server that might also fit in this
category? If so, please advise and remove if they appear to be similar
"test" folders.

Thanks,
Aaron

On Thu, Sep 20, 2018 at 6:00 AM Marco de Abreu
 wrote:

> Sure. Aaron, can you write something down that can be copy and pasted into
> a ticket so that our mentors just have to create it?
>
> I think we could ask Infra for an investigation about *why* this happened,
> considering the fact that it should be an exact mirror, and for tools to
> self-service these kinds of things.
>
> -Marco
>
> On Wed, Sep 19, 2018 at 4:11 PM Aaron Markham 
> wrote:
>
> > It's not on the site repo. Seems like it is only on the Apache infra. Can
> > someone request it's removal?
> >
> > On Tue, Sep 18, 2018, 20:34 Hagay Lupesko  wrote:
> >
> > > The /test site seems to be something old that should have been removed
> a
> > > long time ago, it lists versions 0.10 and 0.10.14 :)
> > > Maybe Aaron has an idea what needs to be done to remove it...
> > >
> > > On Fri, Sep 14, 2018 at 4:55 PM Alex Zai  wrote:
> > >
> > > > Why do we have two sets of installation guides?
> > > >
> > > > http://mxnet.incubator.apache.org/test/get_started/install.html
> > > >
> > > >
> > >
> >
> https://mxnet.incubator.apache.org/install/index.html?platform=Linux&language=Python&processor=CPU
> > > >
> > > > The /test domain is also not secure. If this is not supposed to be
> > > > public, we should remove it, as it is confusing.
> > > >
> > >
> >
>


RE: Remove MKLML as dependency

2018-09-20 Thread Lv, Tao A
Hah, seems it's a little confusing here. I think the "Intel MKL" in the first 
statement includes both the full MKL and MKLML library. And the "dynamic 
library" there obviously means the MKLML which is delivered in MKL-DNN repo. 

MKLML is a subset of full MKL and includes all BLAS functions for both single 
precision and double precision. From this point of view, I think it can be used 
as a BLAS library, but cannot be used as full MKL.

-tao

-Original Message-
From: Chris Olivier [mailto:cjolivie...@gmail.com] 
Sent: Thursday, September 20, 2018 9:36 PM
To: dev@mxnet.incubator.apache.org
Subject: Re: Remove MKLML as dependency

thanks for the info. I am still a little confused — your statement said “MKL” 
and not “MKLML”, so my question is still the same. Are GEMMs in MKLML or just
MKL? I know MKLML doesn’t have a BLAS library like the main MKL.

On Wed, Sep 19, 2018 at 11:49 PM Lv, Tao A  wrote:

> Hi Chris, please kindly check the statements here:
> https://github.com/intel/mkl-dnn#installation
>
> " Intel MKL-DNN can take advantage of optimized matrix-matrix 
> multiplication (GEMM) function from Intel MKL. The dynamic library 
> with this functionality is included in the repository. "
>
> " You can choose to build Intel MKL-DNN without binary dependency. The 
> resulting version will be fully functional, however performance of 
> certain convolution shapes and sizes and inner product relying on 
> SGEMM function may be suboptimal."
>
> -tao
>
> -Original Message-
> From: Chris Olivier [mailto:cjolivie...@gmail.com]
> Sent: Thursday, September 20, 2018 11:20 AM
> To: dev@mxnet.incubator.apache.org
> Subject: Re: Remove MKLML as dependency
>
> maybe I missed it, but what does MKLML have that mkldnn doesn’t have 
> that makes it necessary?
>
> what’s the motivation for removing it?
>
> On Tue, Sep 18, 2018 at 11:31 PM Lv, Tao A  wrote:
>
> > If you just want to test the performance, I think you need to link MKL
> > for BLAS and MKL-DNN for NN. Also MKL-DNN should link MKL for better 
> > performance.
> >
> > Here are some ways for you to install full MKL library if you don't 
> > have
> > one:
> > 1. Register and download from intel website:
> > https://software.intel.com/en-us/mkl
> > 2. Apt-get/yum: currently it needs Intel’s repositories to be configured.
> > a.
> >
> https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-yum-repo
> > b. https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-apt-repo
> > 3. pip install mkl / mkl-devel: ‘mkl’ package has
> > the runtime and ‘mkl-devel’ includes everything with the headers
> > a.
> > https://software.intel.com/en-us/articles/installing-the-intel-distribution-for-python-and-intel-performance-libraries-with-pip-and
> > 4. conda install: also has mkl and mkl-devel
> > a. https://anaconda.org/intel/mkl
> > b. https://anaconda.org/intel/mkl-devel
> >
> > If you want to redistribute MKL with MXNet, you may need take care 
> > of the license issue. Currently, MKL is using ISSL ( 
> > https://software.intel.com/en-us/license/intel-simplified-software-license
> > ).
> >
> > -Original Message-
> > From: Zai, Alexander [mailto:alex...@amazon.com.INVALID]
> > Sent: Wednesday, September 19, 2018 12:49 PM
> > To: dev@mxnet.incubator.apache.org
> > Subject: Re: Remove MKLML as dependency
> >
> > Will test it out tomorrow.
> >
> > As an aside, what is the best way to test the MKL build for MXNet, given
> > that MKL is licensed?
> >
> > Best,
> > Alex
> >
> > On 9/18/18, 7:50 PM, "Lv, Tao A"  wrote:
> >
> > Hi Alex,
> >
> > Thanks for bringing this up.
> >
> > The original intention of MKLML is to provide a light and 
> > easy-to-access library for ML/DL community. It's released with 
> > MKL-DNN under Apache-2.0 license.
> >
> > AFAIK, MKL-DNN still relies on it for better performance. So I'm 
> > afraid there will be a performance regression in MKL pip packages if 
> > MKLML is simply removed.
> >
> > Have you ever tried the build without MKLML, and what does the
> > performance look like?
> >
> > -tao
> >
> > -Original Message-
> > From: Alex Zai [mailto:aza...@gmail.com]
> > Sent: Wednesday, September 19, 2018 4:49 AM
> > To: dev@mxnet.incubator.apache.org
> > Subject: Remove MKLML as dependency
> >
> > On our build from source page we have a list of blas libraries 
> > that are recommended:
> > 
> > https://mxnet.incubator.apache.org/install/build_from_source.html
> >
> > MKL-DNN
> > MKL
> > MKLML
> > Apple Accelerate
> > OpenBlas
> >
> > MKLML is a subset of MKL (
> https://github.com/intel/mkl-dnn/issues/102)
> > and therefore MKLML users can just use MKL instead. Does anyone 
> > see an issue with me removing this? It would 

Re: Remove MKLML as dependency

2018-09-20 Thread Chris Olivier
thanks for the info. I am still a little confused — your statement said
“MKL” and not “MKLML”, so my question is still the same. Are GEMMs in
MKLML or just MKL? I know MKLML doesn’t have a BLAS library like the main
MKL.

On Wed, Sep 19, 2018 at 11:49 PM Lv, Tao A  wrote:

> Hi Chris, please kindly check the statements here:
> https://github.com/intel/mkl-dnn#installation
>
> " Intel MKL-DNN can take advantage of optimized matrix-matrix
> multiplication (GEMM) function from Intel MKL. The dynamic library with
> this functionality is included in the repository. "
>
> " You can choose to build Intel MKL-DNN without binary dependency. The
> resulting version will be fully functional, however performance of certain
> convolution shapes and sizes and inner product relying on SGEMM function
> may be suboptimal."
>
> -tao
>
> -Original Message-
> From: Chris Olivier [mailto:cjolivie...@gmail.com]
> Sent: Thursday, September 20, 2018 11:20 AM
> To: dev@mxnet.incubator.apache.org
> Subject: Re: Remove MKLML as dependency
>
> maybe I missed it, but what does MKLML have that mkldnn doesn’t have that
> makes it necessary?
>
> what’s the motivation for removing it?
>
> On Tue, Sep 18, 2018 at 11:31 PM Lv, Tao A  wrote:
>
> > If you just want to test the performance, I think you need to link MKL
> > for BLAS and MKL-DNN for NN. Also MKL-DNN should link MKL for better
> > performance.
> >
> > Here are some ways for you to install full MKL library if you don't
> > have
> > one:
> > 1. Register and download from intel website:
> > https://software.intel.com/en-us/mkl
> > 2. Apt-get/yum: currently it needs Intel’s repositories to be configured.
> > a.
> >
> https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-yum-repo
> > b. https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-apt-repo
> > 3. pip install mkl / mkl-devel: ‘mkl’ package has
> > the runtime and ‘mkl-devel’ includes everything with the headers
> > a.
> > https://software.intel.com/en-us/articles/installing-the-intel-distribution-for-python-and-intel-performance-libraries-with-pip-and
> > 4. conda install: also has mkl and mkl-devel
> > a. https://anaconda.org/intel/mkl
> > b. https://anaconda.org/intel/mkl-devel
> >
> > If you want to redistribute MKL with MXNet, you may need take care of
> > the license issue. Currently, MKL is using ISSL (
> > https://software.intel.com/en-us/license/intel-simplified-software-license
> > ).
> >
> > -Original Message-
> > From: Zai, Alexander [mailto:alex...@amazon.com.INVALID]
> > Sent: Wednesday, September 19, 2018 12:49 PM
> > To: dev@mxnet.incubator.apache.org
> > Subject: Re: Remove MKLML as dependency
> >
> > Will test it out tomorrow.
> >
> > As an aside, what is the best way to test the MKL build for MXNet, given
> > that MKL is licensed?
> >
> > Best,
> > Alex
> >
> > On 9/18/18, 7:50 PM, "Lv, Tao A"  wrote:
> >
> > Hi Alex,
> >
> > Thanks for bringing this up.
> >
> > The original intention of MKLML is to provide a light and
> > easy-to-access library for ML/DL community. It's released with MKL-DNN
> > under Apache-2.0 license.
> >
> > AFAIK, MKL-DNN still relies on it for better performance. So I'm
> > afraid there will be a performance regression in MKL pip packages if
> > MKLML is simply removed.
> >
> > Have you ever tried the build without MKLML, and what does the
> > performance look like?
> >
> > -tao
> >
> > -Original Message-
> > From: Alex Zai [mailto:aza...@gmail.com]
> > Sent: Wednesday, September 19, 2018 4:49 AM
> > To: dev@mxnet.incubator.apache.org
> > Subject: Remove MKLML as dependency
> >
> > On our build from source page we have a list of blas libraries
> > that are recommended:
> > https://mxnet.incubator.apache.org/install/build_from_source.html
> >
> > MKL-DNN
> > MKL
> > MKLML
> > Apple Accelerate
> > OpenBlas
> >
> > MKLML is a subset of MKL (
> https://github.com/intel/mkl-dnn/issues/102)
> > and therefore MKLML users can just use MKL instead. Does anyone
> > see an issue with me removing this? It would simplify our doc page and
> build file.
> >
> > Alex
> >
> >
> >
>


Re: Some feedback from MXNet Zhihu topic

2018-09-20 Thread Carin Meier
Totally agree about the potentially huge benefit of having new research
papers implemented as examples in MXNet. Wondering if anyone has any
brainstorming ideas about how to facilitate/encourage this?

Also wanted to note that I think the recent progress and attention to
stability will help both speed up the PR process and the release cycle. There
is more work to do in this area, especially in regard to automation of the
release, which I think will yield big dividends down the road. Let's keep up
the good work in this area.

- Carin

On Thu, Sep 20, 2018 at 4:10 AM Naveen Swamy  wrote:

> Qing,
>
> this thread is loaded with very specific suggestions. Thank you for bringing
> them up here. Since Apache MXNet is popular in China, it would be great if
> Mandarin-speaking developers here could bring such feedback and user pain
> points to the community's attention.
>
> 1. To capture specific API/Example/Tutorial that users have an issue on, Mu
> suggested in the past to add thumbs up/down on the website:
> https://issues.apache.org/jira/browse/MXNET-972
>
> 6. The heavy code base is not because of the code in the MXNet repo; it's
> all the sub-modules that are added to the repo - I have had this problem
> too: to build MXNet I have to fetch and build the whole world that MXNet
> depends on, plus its dependencies (subs within subs) - I think it's time to
> revisit and refactor.
>
> For the others, I suggest you work with someone to create actionable JIRAs
> (maybe Denis - because he is knowledgeable about JIRA and creates nice
> actionable stories). It would be nice if these stories could contain many
> first-good-issue tasks for new contributors to pick up - creating
> standalone examples (from existing ones) is a great way for newbies to learn
> MXNet and contribute back.
>
> Examples are very important for someone to not only quickly learn but also
> extend/adopt to their own application, In Scala we(you) have added tests
> around Examples and actually use them as integration tests - we should do
> insist the same for new examples written or old examples that we touch .
>
> In Deep Learning what is more critical and could increase rapid adoption is
> to have the latest and greatest papers implemented as examples - this is a
> call for suggestions and Action to the community.
>
> Thanks, Naveen
>
>
> On Wed, Sep 19, 2018 at 10:39 PM, Aaron Markham  >
> wrote:
>
> > Thanks for this translation and feedback, Qing!
> > I've addressed point 3 of the documentation feedback with this PR:
> > https://github.com/apache/incubator-mxnet/pull/12604
> > I'm not sure how to act on the first two points without some explicit
> > URLs and examples, so if anyone has those I'd be happy to take a look if
> > there's some glitch vs. missing or wrong docs.
> >
> > Also, I agree that there should be some more simple examples. Often the
> > examples are too complicated and unclear about what is important or not.
> > The target audience is deep learning practitioners, not "newbies".
> >
> > And on a related note, I'd really like to pull the Gluon material into
> > the API section. It's confusing as its own navigation item, with
> > orphaned information. It could have a navigation entry at the top of the
> > API list like "Python: Gluon" or just "Gluon", and then list "Python:
> > Module" or just "Python". Or, going the other way, the Gluon menu could
> > have API and Tutorials sections and be more fleshed out, though that is
> > not my preference. Either way, it needs some attention.
> >
> > Cheers,
> > Aaron
> >
> > On Wed, Sep 19, 2018 at 11:04 AM Qing Lan  wrote:
> >
> > > Hi all,
> > >
> > > There was a trending topic recently on
> > > Zhihu (a popular Chinese Stack Overflow + Quora) asking about the
> > > status of MXNet in 2018. Mu replied to the thread and received more
> > > than 300 likes.
> > > However, a few concerns were raised in the comments of the thread;
> > > I have done a rough translation from Chinese to English:
> > >
> > > 1. Documentation! To this day, the online docs still contain:
> > > 1. Deprecated but not updated docs
> > > 2. Wrong documentation with poor descriptions
> > > 3. Documentation in alpha stage, e.g. you must
> > > install with `pip --pre` in order to run.
> > >
> > > 2. Examples! For Gluon specifically, many examples still mix Gluon
> > > and legacy MXNet APIs. The mixture of mx.sym, mx.nd and mx.gluon
> > > confuses users about which one to choose to get their model to work.
> > > As an example, although Gluon made data encapsulation possible, there
> > > are still examples using mx.io.ImageRecordIter with tens of
> > > parameters (it feels like the Gluon examples are simply copied from
> > > the old Python examples).
> > >
> > > 3. Examples again! Compared to PyTorch, there are a few things I
> > > don't like in the Gluon examples:
> > > 1. Some run fine, but the code structure is still
> > > very complicated. Such as 

Re: multiple installation guides?

2018-09-20 Thread Marco de Abreu
Sure. Aaron, can you write something down that can be copied and pasted
into a ticket so that our mentors just have to create it?

I think we could ask Infra to investigate *why* this happened, considering
that it should be an exact mirror, and for tools to self-service these
kinds of things.

-Marco

On Wed, Sep 19, 2018 at 4:11 PM Aaron Markham 
wrote:

> It's not in the site repo. It seems to be only on the Apache infra. Can
> someone request its removal?
>
> On Tue, Sep 18, 2018, 20:34 Hagay Lupesko  wrote:
>
> > The /test site seems to be something old that should have been removed a
> > long time ago, it lists versions 0.10 and 0.10.14 :)
> > Maybe Aaron has an idea what needs to be done to remove it...
> >
> > On Fri, Sep 14, 2018 at 4:55 PM Alex Zai  wrote:
> >
> > > Why do we have two sets of installation guides?
> > >
> > > http://mxnet.incubator.apache.org/test/get_started/install.html
> > >
> > >
> >
> https://mxnet.incubator.apache.org/install/index.html?platform=Linux=Python=CPU
> > >
> > > The /test domain is also not secure. If this is not supposed to be
> > > public, we should remove it, as it is confusing.
> > >
> >
>


Re: Some feedback from MXNet Zhihu topic

2018-09-20 Thread Naveen Swamy
Qing,

These are very specific, substantive suggestions. Thank you for bringing
them up here. Since Apache MXNet is popular in China, it would be great if
Mandarin-speaking developers here could bring such feedback and user pain
points to the community's attention.

1. To capture the specific API/example/tutorial that users have an issue
with, Mu suggested in the past adding thumbs up/down buttons on the
website: https://issues.apache.org/jira/browse/MXNET-972

6. The heavy code base is not because of the code in the MXNet repo itself;
it's all the submodules added to the repo. I have had this problem too: to
build MXNet I have to fetch and build everything MXNet depends on,
including dependencies of dependencies. I think it's time to revisit and
refactor.

For the others, I suggest you work with someone to create actionable JIRAs
(maybe Denis, since he knows JIRA well and creates nice actionable
stories). It would be nice if these stories contained many good-first-issue
tasks for new contributors to pick up; creating standalone examples (from
existing ones) is a great way for newcomers to learn MXNet and contribute
back.

Examples are very important not only for learning quickly but also for
extending/adapting to one's own application. In Scala we (you) have added
tests around the examples and actually use them as integration tests; we
should insist on the same for new examples and for old examples that we
touch.
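The examples-as-integration-tests pattern described above might be sketched like this; the script path and arguments are hypothetical stand-ins (here `sys.executable -c` substitutes for a real example script), so treat it as a shape for the idea rather than MXNet's actual CI code:

```python
# Sketch: run an example script as an integration test, so CI fails loudly
# whenever the example breaks. Paths/flags below are hypothetical stand-ins.
import subprocess
import sys

def run_example(args, timeout=600):
    """Run an example in a subprocess; fail with its stderr on non-zero exit."""
    proc = subprocess.run([sys.executable] + args,
                          capture_output=True, text=True, timeout=timeout)
    assert proc.returncode == 0, proc.stderr
    return proc.stdout

# Stand-in for e.g. ["example/train.py", "--epochs", "1"] on a tiny dataset:
out = run_example(["-c", "print('epoch 1 done')"])
assert "done" in out
```

The same wrapper can be parameterized over every script under `example/` so that new examples are covered automatically.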

In deep learning, what is more critical and could increase rapid adoption
is having the latest and greatest papers implemented as examples. This is a
call for suggestions and action to the community.

Thanks, Naveen


On Wed, Sep 19, 2018 at 10:39 PM, Aaron Markham 
wrote:

> Thanks for this translation and feedback, Qing!
> I've addressed point 3 of the documentation feedback with this PR:
> https://github.com/apache/incubator-mxnet/pull/12604
> I'm not sure how to act on the first two points without some explicit URLs
> and examples, so if anyone has those I'd be happy to take a look if
> there's some glitch vs. missing or wrong docs.
>
> Also, I agree that there should be some more simple examples. Often the
> examples are too complicated and unclear about what is important or not.
> The target audience is deep learning practitioners, not "newbies".
>
> And on a related note, I'd really like to pull the Gluon material into the
> API section. It's confusing as its own navigation item, with orphaned
> information. It could have a navigation entry at the top of the API list
> like "Python: Gluon" or just "Gluon", and then list "Python: Module" or
> just "Python". Or, going the other way, the Gluon menu could have API and
> Tutorials sections and be more fleshed out, though that is not my
> preference. Either way, it needs some attention.
>
> Cheers,
> Aaron
>
> On Wed, Sep 19, 2018 at 11:04 AM Qing Lan  wrote:
>
> > Hi all,
> >
> > There was a trending topic recently on
> > Zhihu (a popular Chinese Stack Overflow + Quora) asking about the status
> > of MXNet in 2018. Mu replied to the thread and received more than 300
> > likes.
> > However, a few concerns were raised in the comments of the thread;
> > I have done a rough translation from Chinese to English:
> >
> > 1. Documentation! To this day, the online docs still contain:
> > 1. Deprecated but not updated docs
> > 2. Wrong documentation with poor descriptions
> > 3. Documentation in alpha stage, e.g. you must install
> > with `pip --pre` in order to run.
> >
> > 2. Examples! For Gluon specifically, many examples still mix Gluon and
> > legacy MXNet APIs. The mixture of mx.sym, mx.nd and mx.gluon confuses
> > users about which one to choose to get their model to work. As an
> > example, although Gluon made data encapsulation possible, there are
> > still examples using mx.io.ImageRecordIter with tens of parameters (it
> > feels like the Gluon examples are simply copied from the old Python
> > examples).
> >
> > 3. Examples again! Compared to PyTorch, there are a few things I don't
> > like in the Gluon examples:
> > 1. Some run fine, but the code structure is still very
> > complicated, e.g. example/image-classification/cifar10.py. It reads like
> > one long concatenation of code: just a series of layers mixed with
> > model.fit. This makes it very hard for users to modify/extend the model.
> > 2. Some only run with certain settings; if users change
> > the model even a little, it crashes. For example, in the multi-GPU
> > example on the Gluon website, MXNet hides the logic that uses the batch
> > size to rescale the learning rate inside the optimizer. Many newcomers
> > don't know this, and only discover that the model stops converging when
> > the batch size changes.
> > 3. Worst of all, sometimes the model itself simply
> > doesn't work. Maintainers in the MXNet community didn't run the model
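The hidden batch-size/learning-rate coupling in point 2 can be illustrated with a small, framework-free sketch. The 1/batch_size scaling below mirrors what MXNet optimizers expose as `rescale_grad`, but the code is plain Python, not MXNet internals:

```python
# Illustrative sketch (plain Python, not MXNet internals) of why changing
# the batch size silently changes the effective learning rate when the
# framework divides summed gradients by batch_size inside the optimizer.

def sgd_update(weight, grad_sum, lr, batch_size):
    """One SGD step on a *summed* minibatch gradient.

    The hidden behavior: the optimizer rescales the gradient by
    1/batch_size (MXNet exposes this as the optimizer's rescale_grad).
    """
    return weight - lr * (grad_sum / batch_size)

w = 1.0
g_per_example = 0.5  # same average per-example gradient in both runs

small = sgd_update(w, g_per_example * 32, lr=0.1, batch_size=32)
large = sgd_update(w, g_per_example * 256, lr=0.1, batch_size=256)

# Both steps move the weight by exactly lr * 0.5 = 0.05, yet the
# batch_size=256 run takes 8x fewer steps per epoch, so training
# effectively slows down unless lr is scaled up with the batch size.
print(small, large)  # 0.95 0.95
```

This is exactly why a model can "stop converging" after only the batch size is changed: the per-step update is unchanged while the number of updates per epoch shrinks.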

RE: Remove MKLML as dependency

2018-09-20 Thread Lv, Tao A
Hi Chris, please kindly check the statements here: 
https://github.com/intel/mkl-dnn#installation 

" Intel MKL-DNN can take advantage of optimized matrix-matrix multiplication 
(GEMM) function from Intel MKL. The dynamic library with this functionality is 
included in the repository. "

" You can choose to build Intel MKL-DNN without binary dependency. The 
resulting version will be fully functional, however performance of certain 
convolution shapes and sizes and inner product relying on SGEMM function may be 
suboptimal."

-tao

-Original Message-
From: Chris Olivier [mailto:cjolivie...@gmail.com] 
Sent: Thursday, September 20, 2018 11:20 AM
To: dev@mxnet.incubator.apache.org
Subject: Re: Remove MKLML as dependency

maybe I missed it, but what does MKLML have that mkldnn doesn’t have that makes 
it necessary?

what’s the motivation for removing it?

On Tue, Sep 18, 2018 at 11:31 PM Lv, Tao A  wrote:

> If you just want to test the performance, I think you need to link MKL
> for BLAS and MKL-DNN for NN. MKL-DNN should also link MKL for better
> performance.
>
> Here are some ways to install the full MKL library if you don't have
> one:
> 1. Register and download from the Intel website:
> https://software.intel.com/en-us/mkl
> 2. Apt-get/yum: currently this requires configuring Intel's repositories.
> a.
> https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-yum-repo
> b.
> https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-apt-repo
> 3. pip install mkl / mkl-devel: the 'mkl' package has
> the runtime and 'mkl-devel' includes everything plus the headers
> a.
> https://software.intel.com/en-us/articles/installing-the-intel-distribution-for-python-and-intel-performance-libraries-with-pip-and
> 4. conda install: also has mkl and mkl-devel
> a. https://anaconda.org/intel/mkl
> b. https://anaconda.org/intel/mkl-devel
>
> If you want to redistribute MKL with MXNet, you may need to take care of
> the license issue. Currently, MKL uses the ISSL (
> https://software.intel.com/en-us/license/intel-simplified-software-license
> ).
>
> -Original Message-
> From: Zai, Alexander [mailto:alex...@amazon.com.INVALID]
> Sent: Wednesday, September 19, 2018 12:49 PM
> To: dev@mxnet.incubator.apache.org
> Subject: Re: Remove MKLML as dependency
>
> Will test it out tomorrow.
>
> As an aside, what is the best way to test the MKL build for MXNet? MKL is
> licensed?
>
> Best,
> Alex
>
> On 9/18/18, 7:50 PM, "Lv, Tao A"  wrote:
>
> Hi Alex,
>
> Thanks for bringing this up.
>
> The original intention of MKLML is to provide a light and 
> easy-to-access library for ML/DL community. It's released with MKL-DNN 
> under Apache-2.0 license.
>
> AFAIK, MKL-DNN still relies on it for better performance. So I'm 
> afraid there will be a performance regression in MKL pip packages if 
> MKLML is simply removed.
>
> Have you tried building without MKLML, and how does the performance
> look?
>
> -tao
>
> -Original Message-
> From: Alex Zai [mailto:aza...@gmail.com]
> Sent: Wednesday, September 19, 2018 4:49 AM
> To: dev@mxnet.incubator.apache.org
> Subject: Remove MKLML as dependency
>
> On our build-from-source page we have a list of BLAS libraries
> that are recommended:
> https://mxnet.incubator.apache.org/install/build_from_source.html
>
> MKL-DNN
> MKL
> MKLML
> Apple Accelerate
> OpenBlas
>
> MKLML is a subset of MKL (https://github.com/intel/mkl-dnn/issues/102)
> and therefore MKLML users can just use MKL instead. Does anyone
> see an issue with me removing it? It would simplify our doc page and
> build file.
>
> Alex
>
>
>