libjpegturbo

2019-02-11 Thread Per da Silva
Hello everyone,

I was wondering if there is any particular reason why we build and
test MXNet with USE_LIBJPEG_TURBO=0, given that we ship it
with USE_LIBJPEG_TURBO=1 (e.g. make/pip/pip_linux_cpu.mk).

I ran into issues trying to compile MXNet with the libjpeg-turbo flag on
Ubuntu 16.04 (I was wondering if this was the reason). This came from an
issue with the libturbojpeg-dev package. There is a fix described in [1]. I've
applied it in a PR, which I'm currently testing [2].

Cheers,

Per

[1] https://github.com/HaxeFoundation/hashlink/issues/147
[2] https://github.com/apache/incubator-mxnet/pull/14127


Re: [RESTARTING][VOTE] Release Apache MXNet (incubating) version 1.4.0.rc2

2019-02-11 Thread Lin Yuan
+1 binding
Horovod is going to publish its 0.16.0 release in the coming week with MXNet
integration. We need to release 1.4.0, which includes all the dependencies
for Horovod integration.

Best,

Lin

On Mon, Feb 11, 2019 at 9:30 PM Steffen Rochel 
wrote:

> Dear community -
> based on Justin's and community feedback I'm suggesting to restart the
> vote.
> Current status:
> binding votes:
> +1: 2 votes (Henri, Jason)
> -1:  1 vote (Luciano)
>
> non-binding:
> +1: 1 vote (Kellen)
>
> The community is investigating feedback from Luciano that the exclusion
> file is too broad, potentially causing files which can and must have
> Apache license headers to be excluded from the check.
>
> Regards,
> Steffen
>
>
>
>
> On Mon, Feb 11, 2019 at 10:08 AM Hagay Lupesko  wrote:
>
> > Based on Justin's feedback, can we resume the vote instead of cancelling
> > it?
> >
> > On Mon, Feb 11, 2019 at 12:02 AM Justin Mclean  >
> > wrote:
> >
> > > Hi,
> > >
> > > In the future don’t be so hasty to cancel a release vote; people's minds
> > > can be changed, and a -1 is not a veto on a release.
> > >
> > > Thanks,
> > > Justin
> > >
> > >
> > > -
> > > To unsubscribe, e-mail: general-unsubscr...@incubator.apache.org
> > > For additional commands, e-mail: general-h...@incubator.apache.org
> > >
> > >
> >
>


[RESTARTING][VOTE] Release Apache MXNet (incubating) version 1.4.0.rc2

2019-02-11 Thread Steffen Rochel
Dear community -
based on Justin's and community feedback I'm suggesting to restart the vote.
Current status:
binding votes:
+1: 2 votes (Henri, Jason)
-1:  1 vote (Luciano)

non-binding:
+1: 1 vote (Kellen)

The community is investigating feedback from Luciano that the exclusion
file is too broad, potentially causing files which can and must have
Apache license headers to be excluded from the check.

Regards,
Steffen




On Mon, Feb 11, 2019 at 10:08 AM Hagay Lupesko  wrote:

> Based on Justin's feedback, can we resume the vote instead of cancelling
> it?
>
> On Mon, Feb 11, 2019 at 12:02 AM Justin Mclean 
> wrote:
>
> > Hi,
> >
> > In the future don’t be so hasty to cancel a release vote; people's minds can
> > be changed, and a -1 is not a veto on a release.
> >
> > Thanks,
> > Justin
> >
> >
> > -
> > To unsubscribe, e-mail: general-unsubscr...@incubator.apache.org
> > For additional commands, e-mail: general-h...@incubator.apache.org
> >
> >
>


Re: [CANCELLED][VOTE] Release Apache MXNet (incubating) version 1.4.0.rc2

2019-02-11 Thread sandeep krishnamurthy
I do believe this is in the benefit of the MXNet community. MXNet 1.4 is an
important release with many useful features for our users:

1. Java Inference API, JVM memory management, Julia APIs
2. Multiple important directional experimental features - Subgraph API,
control flow operators, Topology aware all-reduce approach in distributed
training
3. Enhancements to many user-facing functionalities - ONNX operator
coverage, new operators such as trigonometric operators, debugging operators
4. 75+ bug fixes
5. Documentation, tutorials, examples updates and issue fixes

*The RAT check failures are important and we will fix them soon in the
following release*. Holding off the release will further delay so many
useful features for users through a stable release. In my opinion, it would
be beneficial for MXNet community to make this release happen.

Best,
Sandeep
On Mon, Feb 11, 2019 at 4:37 PM Qing Lan  wrote:

> Hi All,
>
> Can we move the VOTE forward, since the RAT license issue should not be a
> problem that blocks the release? We can always address it in our future
> releases (e.g. 1.4.1 or 1.5.0).
>
> As you may be aware, the 1.4.0 release started very early this year and has
> been delayed a couple of times until now. From the Apache Release process:
> http://www.apache.org/legal/release-policy.html, it should be fine for us
> to move forward if the majority of votes are +1 rather than -1.
>
> Again, moving forward does not mean we should ignore or leave unfixed the
> problems that exist in the code. We will address and fix them in the next release.
>
> Thanks,
> Qing
>
> On 2/10/19, 10:28 PM, "Steffen Rochel"  wrote:
>
> Dear community -
> I'm cancelling the vote due to -1 feedback from Luciano due to RAT
> failures.
> For details see
>
> https://lists.apache.org/thread.html/51e9ab05edae2089c74a253000a92d5aa5c6406f54e5bd0a0b3c3879@%3Cgeneral.incubator.apache.org%3E
>
> The MXNet community will discuss next steps.
>
> Regards,
> Steffen
>
>
>

-- 
Sandeep Krishnamurthy


Re: [CANCELLED][VOTE] Release Apache MXNet (incubating) version 1.4.0.rc2

2019-02-11 Thread Qing Lan
Hi All,

Can we move the VOTE forward, since the RAT license issue should not be a problem
that blocks the release? We can always address it in our future releases (e.g. 1.4.1
or 1.5.0).

As you may be aware, the 1.4.0 release started very early this year and has been
delayed a couple of times until now. From the Apache Release process:
http://www.apache.org/legal/release-policy.html, it should be fine for us to
move forward if the majority of votes are +1 rather than -1.

Again, moving forward does not mean we should ignore or leave unfixed the problems
that exist in the code. We will address and fix them in the next release.

Thanks,
Qing

On 2/10/19, 10:28 PM, "Steffen Rochel"  wrote:

Dear community -
I'm cancelling the vote due to -1 feedback from Luciano due to RAT
failures.
For details see

https://lists.apache.org/thread.html/51e9ab05edae2089c74a253000a92d5aa5c6406f54e5bd0a0b3c3879@%3Cgeneral.incubator.apache.org%3E

The MXNet community will discuss next steps.

Regards,
Steffen




Re: Question about Gluon Api for Scala package & JVM langs

2019-02-11 Thread Carin Meier
I started a proposal page on it
https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=103089990

It is a big chunk of work and needs some serious analysis - but it's a
starting point for a conversation :)

On Tue, Jan 22, 2019 at 1:56 PM Carin Meier  wrote:

> Thanks!
>
> I've heard this from our Clojure community so if anyone else would like to
> chime in, please feel free...
>
> One popular Deep Learning book out there is "Dive into Deep Learning"
> https://d2l.ai/ -  which has its examples in Gluon. It would be nice as a
> starting point in the discussion to see what is covered in there and the
> scope/effort it would be to build it out.
>
> I'll help draft a proposal in the next few days.
>
> - Carin
>
> On Tue, Jan 22, 2019 at 1:32 PM Qing Lan  wrote:
>
>> Hi Carin,
>>
>> Thanks for your question. I would like to know which part(s) of Gluon you
>> would like to see for JVM in general?
>> Since Gluon itself is big, we can start working on the components users need
>> the most and then cover the rest.
>> Please feel free to draft a proposal somewhere and we can discuss
>> it.
>>
>> Thanks,
>> Qing
>>
>> On 1/21/19, 4:53 PM, "Carin Meier"  wrote:
>>
>> Currently the Scala package supports the Module and FeedForward APIs.
>> Since
>> quite a bit of the documentation is focusing now on the Gluon API, I
>> wondered if there were thoughts of bringing the Gluon interface to the
>> Scala package in the future for the JVM langs.
>>
>> - Carin
>>
>>
>>


Re: RE: Third-party package tests for MXNet nightly builds

2019-02-11 Thread Sheng Zha
Thanks for the proposal, Felix. On one hand, I agree that richer workloads from
the ecosystem help find issues in MXNet early. On the other hand, I'm
concerned about tightly coupling the development of projects.

Monitoring the upstream library and addressing problems when upgrading a
dependency should be the concern of the downstream projects. These projects own
the effort of having proper testing for any changes needed, including version
upgrades. Having these projects in MXNet CI means the responsibility of
maintaining them partly transfers to MXNet's contributors, which
doesn't seem right. It blurs the line of who's responsible for debugging,
isolating the problem, making a minimal reproducible sample, and posting the
fix.

That said, I think there's much opportunity for reusing the current code for 
MXNet CI. Projects in MXNet's ecosystem would likely benefit from MXNet's CI 
solution so that each individual community project can identify issues early. 
(And from offline chats with Chance and his team members, I think this is 
what's already on their minds.)

-sz

On 2019/02/11 16:46:06, "Zhao, Patric"  wrote: 
> I agree we should track the 3rd-party packages, which make MXNet more prosperous :)
> 
> Before building the CI, I suggest creating the related labels, like sockeye,
> gluonCV, gluonNLP, etc., in GitHub and giving high priority to these
> issues/PRs,
> so the issues/PRs can be fixed quickly and these important applications would
> not be blocked again.
> 
> We can help with the performance/backend/operator-related issues as well :)
> 
> Thanks,
> 
> --Patric 
> 
> 
> 
> > -Original Message-
> > From: Chance Bair [mailto:chanceb...@gmail.com]
> > Sent: Monday, February 11, 2019 11:28 PM
> > To: dev@mxnet.incubator.apache.org
> > Cc: d...@mxnet.apache.org
> > Subject: Re: Third-party package tests for MXNet nightly builds
> > 
> > Hi Felix,
> > 
> > Thank you for the request!  The CI team is currently working on improving
> > our benchmarking platform and will evaluate this request carefully.
> > 
> > Chance Bair
> > 
> > 
> > 
> > On Mon, Feb 11, 2019 at 3:59 PM Carin Meier 
> > wrote:
> > 
> > > Can't speak for the CI team, but in general I think that it is a good idea.
> > >
> > > On a separate note, I've been playing around with Sockeye recently and
> > > it's great! Awesome work and glad to see MXNet used for such cutting
> > > edge use cases.
> > > I'd love to see closer collaboration with the Sockeye team and MXNet
> > > for innovation, cross pollination, and evangelization of what MXNet can
> > do .
> > >
> > > Best,
> > > Carin
> > >
> > > On Mon, Feb 11, 2019 at 6:01 AM Felix Hieber 
> > > wrote:
> > >
> > > > Hello dev@,
> > > >
> > > >
> > > >
> > > > I would like to ask around whether there is interest in the
> > > > community to test nightly builds of MXNet with third-party packages
> > > > that depend on
> > > MXNet
> > > > and act as early adopters. The goal is to catch regressions in MXNet
> > > early,
> > > > allowing time for bug fixes before a new release is cut.
> > > >
> > > >
> > > >
> > > > For example, Sockeye  is a
> > > > customer
> > > of
> > > > new MXNet releases and aims to upgrade to latest MXNet as soon as
> > > possible.
> > > > Typically, we update our dependency on MXNet once a new release
> > > > becomes available (through pip). However, there have been cases
> > > > where new
> > > releases
> > > > of MXNet introduced regressions undetected by MXNet tests (hence
> > > > passing the release process): the latest example is this issue
> > > > , which may
> > > > have been introduced already back in October, but, due to infrequent
> > > > MXNet releases, has only surfaced recently and will most likely
> > > > force us to
> > > wait
> > > > for a post or 1.4.1 release. In this particular example, Sockeye’s
> > > > tests would have detected this, and the issue could have been
> > > > created already
> > > in
> > > > October, potentially avoiding its presence in the 1.4.0 release.
> > > >
> > > >
> > > >
> > > > More generally, I think there are several third-party packages with
> > > > valuable test suites (e.g. gluon-nlp) that can contribute to
> > > > catching
> > > MXNet
> > > > regressions or incompatibilities early. Running these test suites
> > > > for
> > > each
> > > > and every PR or commit on the MXNet main repo would be too much
> > overhead.
> > > > My proposal would be to trigger these tests with the nightly builds
> > > > (pip
> > > > releases) of MXNet in a separate CI pipeline that is able to notify
> > > > the
> > > 3p
> > > > maintainers in a case of failure, but does not block MXNet
> > > > development
> > > (or
> > > > nightly build releases) in any way.
> > > >
> > > > Roughly it would do the following:
> > > >
> > > >- pip install mxnet--
> > > >- for each 3p package that is part of the pipeline:
> > > >   - 

Re: [RESULTS][VOTE] Release Apache MXNet (incubating) version 1.4.0.rc2

2019-02-11 Thread Sheng Zha
Update on issues 1 and 4:
For 1, I fixed the NOTICE year in the master branch [1]. If we are to create a new
rc, the fix should be cherry-picked.
For 4, MKLDNN has found the issue [2] and posted the fix in their master
branch. I'm requesting that the fix be backported to the minor version 0.17
that mxnet 1.4 is using.

-sz

[1] https://github.com/apache/incubator-mxnet/pull/14043
[2] https://github.com/intel/mkl-dnn/issues/405#issuecomment-462400456

On 2019/02/05 04:41:32, Steffen Rochel  wrote: 
> Dear MXNet community -
> the result of the vote to release Apache MXNet (incubating) version
> 1.4.0.rc2 are as follows:
> Binding:
> +1  three (Carin, Indhu, Haibin)
> +0  one (Sheng)
> -0   one (Anirudh)
> -1   none
> 
> Non-binding:
> +1  six   (Yuxi, Aston, Kellen, Aaron, Tao, Lin)
> 0 none
> -1 none
> 
> Voting thread:
> 
> https://lists.apache.org/thread.html/5d4aa084e51e9be919d62bfd0e6d625f37318624124a033a5c48507c@%3Cdev.mxnet.apache.org%3E
> 
> 
> The following issues have been raised with v1.4.0.rc2:
> 1. NOTICE year is wrong (2018): Not considered a stopping issue as release
> was started in 2018.
> 2. TVM NOTICE missing - TVM NOTICE file was added post the commit ID used
> in MXNet v1.4.0.rc2 release, not considered a stopping issue
> 3. build with make passes, but build with cmake failed in
> 3rdparty/dmlc-core/test/unittest
> 4. Recent MKLDNN upgrade prevents us from offering binary distribution for
> earlier versions before OSX 10.13.
> 
> The vote results meet the release voting criteria as defined at
> https://www.apache.org/foundation/voting.html#ReleaseVotes: 3 +1 binding
> votes, no -1, more positive than negative votes.
> I'm not sure there is a difference between -0 and +0 votes, but even if
> there is a difference, there are more positive than negative votes.
> 
> I do not consider the issues raised as show stoppers for moving forward with
> the release. I suggest getting these issues addressed in the next release
> or with a patch on version 1.4.0.
> To give everybody a chance to weigh in on my decision as release manager, I
> will wait until Wednesday 9am PST (about 36h from now) before starting the vote
> on the general list.
> Please speak up asap if you think the release cannot move forward as is and
> provide justification.
> 
> Regards,
> Steffen
> 


Re: Gluon fit API- Design proposal

2019-02-11 Thread Tommy Pujol
STOP

On Sun, Feb 10, 2019 at 10:43 PM Hagay Lupesko  wrote:

> Wanted to chime in as well.
> I have reviewed the design shared in the mail offline with Ankit, Lai and
> Naveen (we work in the same team in Amazon).
>
> I think it does a good job at simplifying many low-complexity training use
> cases, which can make MXNet/Gluon even more friendly to so-called "deep
> learning beginners" - so +1 on the proposal!
>
> Hagay
>
> On Fri, Feb 8, 2019 at 10:30 AM Naveen Swamy  wrote:
>
> > Hi Alfredo,
> > Thanks for your comments, I really like all your suggestions. Here are my
> > answers let me know if it makes sense or have comments.
> >
> > 1) The fit API is targeting novice users covering about 80% of the use
> > cases listed in the document. For advanced users,
> > and complex models, we (Naveen, Ankit and Lai) felt it's best to use the
> > existing mechanisms, due to their imperative nature and the greater control
> > they can give, so we did not duplicate the save/load functionality in the
> > HybridBlock.
> > We’ll consider and extend the functionality to Estimator.
> > I have had trouble using the pickle package, which is commonly used for
> > serialization and deserialization; if you have any other suggestions from
> > your experience, please let us know.
> >
> > 2) +1, we’ll add this to our backlog and add it in our next iteration.
> >
> > 3) Can you expand a little more on this and on how it helps in a production
> > environment (which this API was not targeted for)?
> > I’ll check the TF Estimator to understand further.
> >
> > Thanks, Naveen
> >
> >
> > On Thu, Feb 7, 2019 at 2:32 PM Alfredo Luque
> >  wrote:
> >
> > > This is great and something we should all be able to benefit from.
> > >
> > > There are just three pieces I’d like to advocate for that I feel are
> > > shortcomings of some competing APIs on other frameworks (eg; TF
> > Estimators)
> > > and I would love to see in this proposal:
> > >
> > > 1) Make serialization/deserialization of these classifiers/regressors
> > easy
> > > or at least ensure the internal members of the wrapper are easy to
> > > save/load. We’ve hacked around this by only allowing hybrid blocks
> which
> > > have easy save/load functionality, but having a simple
> > > “save_model”/“load_model” function as a 1st class citizen of these
> > proposed
> > > APIs will lead to a vastly improved user experience down the road.
> > >
> > > 2) Allowing the fit/predict/predict_proba functions to take in both
> data
> > > loaders and simple numpy arrays and pandas dataframes is a simple
> change
> > > but a huge usability improvement. Power users and library authors will
> > > appreciate being able to use custom data loaders but a large portion of
> > end
> > > users want to just pass an ndarray or data frame and get some results
> > > quickly.
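Accepting both loaders and plain arrays boils down to a small normalization shim at the top of fit/predict. A minimal sketch of the idea (the helper name and the type checks are illustrative assumptions, deliberately duck-typed so it runs without numpy or pandas installed, not any proposed API):

```python
def as_batches(data, batch_size=32):
    """Normalize heterogeneous input into an iterable of batches.

    Loader-like objects (anything already iterable batch-by-batch) pass
    through untouched; plain sequences/arrays/dataframes get chunked.
    The checks here are placeholder duck-typing for this sketch only.
    """
    if hasattr(data, "__next__") or hasattr(data, "batch_size"):
        return data  # already a DataLoader-style object
    # anything sliceable is treated as one big array and chunked
    return (data[i:i + batch_size] for i in range(0, len(data), batch_size))
```

With a shim like this, `fit` can accept a custom loader from power users while end users just pass an array-like object.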
> > >
> > > 3) Allow lazy construction of the model. This is something I feel TF
> > > Estimators do well: by allowing the user to pass a function that
> > constructs
> > > the net (i.e a model_fn that returns the net) rather than the net
> itself
> > it
> > > allows for more control at runtime and usage of these APIs in a
> > production
> > > environment.
> > >
> > > Would love your thoughts on these three changes/additions.
> > >
> > > —Alfredo Luque
> > > Software Engineer
> > > Machine Learning Infrastructure
> > > Airbnb
> > > San Francisco, CA
> > >
> > > On February 7, 2019 at 1:51:17 PM, Ankit Khedia (
> khedia.an...@gmail.com)
> > > wrote:
> > >
> > > Hello dev@,
> > >
> > > Training a model in Gluon requires users to write the training loop. This
> > > is useful because of its imperative nature; however, repeating the same
> > code
> > > across multiple models can become tedious and repetitive with
> boilerplate
> > > code. The training loop can also be overwhelming to some users new to
> > deep
> > > learning. Users have asked in [1] for a simple Fit API, similar to APIs
> > > available in SKLearn and Keras, as a way to simplify model training and
> > > reduce boilerplate code and complexity.
> > >
> > > So I, along with contributors Naveen and Lai, came up with a fit
> API
> > > proposal in [2] that covers 80% of the use-cases for beginners; the fit
> > API
> > > does not replace the Gluon training loops. The API proposal is inspired
> > by
> > > the Keras fit API. I have discussed and got feedback from a few MXNet
> > > contributors (Sheng, Mu, Aston, Zhi) close by and I am writing to ask
> for
> > > the community’s feedback on the API proposal.
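As a rough sketch of the kind of call-site simplification being discussed (the `Estimator` class shape and argument names here are hypothetical illustrations of a Keras-style `fit`; the actual proposed interface is defined in the design doc, not here):

```python
# Illustrative sketch only: class and argument names are hypothetical,
# not the proposed Gluon fit API.
class Estimator:
    """Wraps the repetitive Gluon-style training loop behind one fit() call."""

    def __init__(self, forward, loss_fn, step):
        self.forward = forward    # the net's forward pass
        self.loss_fn = loss_fn    # loss computation
        self.step = step          # backward pass + parameter update

    def fit(self, train_data, epochs=1):
        """Run the boilerplate loop the user would otherwise write by hand."""
        epoch_losses = []
        for _ in range(epochs):
            total = 0.0
            for x, y in train_data:
                loss = self.loss_fn(self.forward(x), y)
                self.step(loss)
                total += loss
            epoch_losses.append(total / len(train_data))
        return epoch_losses
```

A beginner would then write one `Estimator(...).fit(data, epochs=5)` call instead of the full imperative loop, while advanced users keep writing the loop directly.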
> > >
> > >
> > >
> > > [1]
> > >
> >
> https://discuss.mxnet.io/t/wrapping-gluon-into-scikit-learn-like-api/2112
> > > [2]
> > >
> > >
> >
> https://cwiki.apache.org/confluence/display/MXNET/Gluon+Fit+API+-+Tech+Design
> > >
> > >
> > > Thanks,
> > > Ankit
> > >
> > >
> > > —
> > > Alfredo Luque
> > > Software Engineer
> > > Machine Learning Infrastructure
> > > Airbnb
> > > San Francisco, CA
> > >
> >
>


-- 
Best regards,

Tommy Pujol


RE: Third-party package tests for MXNet nightly builds

2019-02-11 Thread Zhao, Patric
I agree we should track the 3rd-party packages, which make MXNet more prosperous :)

Before building the CI, I suggest creating the related labels, like sockeye,
gluonCV, gluonNLP, etc., in GitHub and giving high priority to these
issues/PRs,
so the issues/PRs can be fixed quickly and these important applications would
not be blocked again.

We can help with the performance/backend/operator-related issues as well :)

Thanks,

--Patric 



> -Original Message-
> From: Chance Bair [mailto:chanceb...@gmail.com]
> Sent: Monday, February 11, 2019 11:28 PM
> To: dev@mxnet.incubator.apache.org
> Cc: d...@mxnet.apache.org
> Subject: Re: Third-party package tests for MXNet nightly builds
> 
> Hi Felix,
> 
> Thank you for the request!  The CI team is currently working on improving
> our benchmarking platform and will evaluate this request carefully.
> 
> Chance Bair
> 
> 
> 
> On Mon, Feb 11, 2019 at 3:59 PM Carin Meier 
> wrote:
> 
> > Can't speak for the CI team, but in general I think that it is a good idea.
> >
> > On a separate note, I've been playing around with Sockeye recently and
> > it's great! Awesome work and glad to see MXNet used for such cutting
> > edge use cases.
> > I'd love to see closer collaboration with the Sockeye team and MXNet
> > for innovation, cross pollination, and evangelization of what MXNet can
> do .
> >
> > Best,
> > Carin
> >
> > On Mon, Feb 11, 2019 at 6:01 AM Felix Hieber 
> > wrote:
> >
> > > Hello dev@,
> > >
> > >
> > >
> > > I would like to ask around whether there is interest in the
> > > community to test nightly builds of MXNet with third-party packages
> > > that depend on
> > MXNet
> > > and act as early adopters. The goal is to catch regressions in MXNet
> > early,
> > > allowing time for bug fixes before a new release is cut.
> > >
> > >
> > >
> > > For example, Sockeye  is a
> > > customer
> > of
> > > new MXNet releases and aims to upgrade to latest MXNet as soon as
> > possible.
> > > Typically, we update our dependency on MXNet once a new release
> > > becomes available (through pip). However, there have been cases
> > > where new
> > releases
> > > of MXNet introduced regressions undetected by MXNet tests (hence
> > > passing the release process): the latest example is this issue
> > > , which may
> > > have been introduced already back in October, but, due to infrequent
> > > MXNet releases, has only surfaced recently and will most likely
> > > force us to
> > wait
> > > for a post or 1.4.1 release. In this particular example, Sockeye’s
> > > tests would have detected this, and the issue could have been
> > > created already
> > in
> > > October, potentially avoiding its presence in the 1.4.0 release.
> > >
> > >
> > >
> > > More generally, I think there are several third-party packages with
> > > valuable test suites (e.g. gluon-nlp) that can contribute to
> > > catching
> > MXNet
> > > regressions or incompatibilities early. Running these test suites
> > > for
> > each
> > > and every PR or commit on the MXNet main repo would be too much
> overhead.
> > > My proposal would be to trigger these tests with the nightly builds
> > > (pip
> > > releases) of MXNet in a separate CI pipeline that is able to notify
> > > the
> > 3p
> > > maintainers in a case of failure, but does not block MXNet
> > > development
> > (or
> > > nightly build releases) in any way.
> > >
> > > Roughly it would do the following:
> > >
> > >- pip install mxnet--
> > >- for each 3p package that is part of the pipeline:
> > >   - clone/setup up package
> > >   - run unit/integration tests of package with some timeout
> > >   - in case of failure, notify package owner
> > >
> > >
> > >
> > > I am not familiar with the current CI pipelines, their requirements
> > > and resources. It would be great if someone from the CI team could
> > > chime in
> > and
> > > evaluate whether such a proposal seems doable and worthwhile.
> > >
> > >
> > >
> > > Best,
> > >
> > > Felix
> > >
> >


Re: Third-party package tests for MXNet nightly builds

2019-02-11 Thread Chance Bair
Hi Felix,

Thank you for the request!  The CI team is currently working on improving
our benchmarking platform and will evaluate this request carefully.

Chance Bair



On Mon, Feb 11, 2019 at 3:59 PM Carin Meier  wrote:

> Can't speak for the CI team, but in general I think that it is a good idea.
>
> On a separate note, I've been playing around with Sockeye recently and it's
> great! Awesome work and glad to see MXNet used for such cutting edge use
> cases.
> I'd love to see closer collaboration with the Sockeye team and MXNet for
> innovation, cross-pollination, and evangelization of what MXNet can do.
>
> Best,
> Carin
>
> On Mon, Feb 11, 2019 at 6:01 AM Felix Hieber 
> wrote:
>
> > Hello dev@,
> >
> >
> >
> > I would like to ask around whether there is interest in the community to
> > test nightly builds of MXNet with third-party packages that depend on
> MXNet
> > and act as early adopters. The goal is to catch regressions in MXNet
> early,
> > allowing time for bug fixes before a new release is cut.
> >
> >
> >
> > For example, Sockeye  is a customer
> of
> > new MXNet releases and aims to upgrade to latest MXNet as soon as
> possible.
> > Typically, we update our dependency on MXNet once a new release becomes
> > available (through pip). However, there have been cases where new
> releases
> > of MXNet introduced regressions undetected by MXNet tests (hence passing
> > the release process): the latest example is this issue
> > , which may have
> > been introduced already back in October, but, due to infrequent MXNet
> > releases, has only surfaced recently and will most likely force us to
> wait
> > for a post or 1.4.1 release. In this particular example, Sockeye’s tests
> > would have detected this, and the issue could have been created already
> in
> > October, potentially avoiding its presence in the 1.4.0 release.
> >
> >
> >
> > More generally, I think there are several third-party packages with
> > valuable test suites (e.g. gluon-nlp) that can contribute to catching
> MXNet
> > regressions or incompatibilities early. Running these test suites for
> each
> > and every PR or commit on the MXNet main repo would be too much overhead.
> > My proposal would be to trigger these tests with the nightly builds (pip
> > releases) of MXNet in a separate CI pipeline that is able to notify the
> 3p
> > maintainers in a case of failure, but does not block MXNet development
> (or
> > nightly build releases) in any way.
> >
> > Roughly it would do the following:
> >
> >- pip install mxnet--
> >- for each 3p package that is part of the pipeline:
> >   - clone/setup up package
> >   - run unit/integration tests of package with some timeout
> >   - in case of failure, notify package owner
> >
> >
> >
> > I am not familiar with the current CI pipelines, their requirements and
> > resources. It would be great if someone from the CI team could chime in
> and
> > evaluate whether such a proposal seems doable and worthwhile.
> >
> >
> >
> > Best,
> >
> > Felix
> >
>


Re: Third-party package tests for MXNet nightly builds

2019-02-11 Thread Carin Meier
Can't speak for the CI team, but in general I think that it is a good idea.

On a separate note, I've been playing around with Sockeye recently and it's
great! Awesome work and glad to see MXNet used for such cutting edge use
cases.
I'd love to see closer collaboration with the Sockeye team and MXNet for
innovation, cross-pollination, and evangelization of what MXNet can do.

Best,
Carin

On Mon, Feb 11, 2019 at 6:01 AM Felix Hieber  wrote:

> Hello dev@,
>
>
>
> I would like to ask around whether there is interest in the community to
> test nightly builds of MXNet with third-party packages that depend on MXNet
> and act as early adopters. The goal is to catch regressions in MXNet early,
> allowing time for bug fixes before a new release is cut.
>
>
>
> For example, Sockeye  is a customer of
> new MXNet releases and aims to upgrade to latest MXNet as soon as possible.
> Typically, we update our dependency on MXNet once a new release becomes
> available (through pip). However, there have been cases where new releases
> of MXNet introduced regressions undetected by MXNet tests (hence passing
> the release process): the latest example is this issue
> , which may have
> been introduced already back in October, but, due to infrequent MXNet
> releases, has only surfaced recently and will most likely force us to wait
> for a post or 1.4.1 release. In this particular example, Sockeye’s tests
> would have detected this, and the issue could have been created already in
> October, potentially avoiding its presence in the 1.4.0 release.
>
>
>
> More generally, I think there are several third-party packages with
> valuable test suites (e.g. gluon-nlp) that can contribute to catching MXNet
> regressions or incompatibilities early. Running these test suites for each
> and every PR or commit on the MXNet main repo would be too much overhead.
> My proposal would be to trigger these tests with the nightly builds (pip
> releases) of MXNet in a separate CI pipeline that is able to notify the 3p
> maintainers in a case of failure, but does not block MXNet development (or
> nightly build releases) in any way.
>
> Roughly it would do the following:
>
>- pip install mxnet--
>- for each 3p package that is part of the pipeline:
>   - clone/setup up package
>   - run unit/integration tests of package with some timeout
>   - in case of failure, notify package owner
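The steps above amount to a small driver script. A minimal Python illustration of that loop (the package table, wheel handling, and `notify` hook are all placeholder assumptions for this sketch, not an existing MXNet CI interface):

```python
import subprocess

# Hypothetical package table for illustration; real entries and test
# commands would come from the 3p package maintainers.
THIRD_PARTY_PACKAGES = {
    "sockeye": ("https://github.com/awslabs/sockeye.git", ["pytest"]),
}

def run(cmd, cwd=None, timeout=3600):
    """Run a command; True on exit code 0, False on failure or timeout."""
    try:
        return subprocess.run(cmd, cwd=cwd, timeout=timeout).returncode == 0
    except (subprocess.TimeoutExpired, OSError):
        return False

def notify(package):
    """Placeholder hook: alert the package owner (e-mail, GitHub issue, ...)."""
    print(f"nightly tests failed for {package}")

def nightly_pipeline(nightly_wheel):
    """Install a nightly MXNet build, then run each 3p package's tests."""
    results = {}
    run(["pip", "install", "--pre", nightly_wheel])
    for name, (repo, test_cmd) in THIRD_PARTY_PACKAGES.items():
        ok = run(["git", "clone", "--depth", "1", repo, name]) and \
             run(test_cmd, cwd=name)
        results[name] = ok
        if not ok:
            notify(name)
    return results
```

Failures in this loop would only notify the 3p maintainers; nothing here gates MXNet PRs or the nightly build itself.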
>
>
>
> I am not familiar with the current CI pipelines, their requirements and
> resources. It would be great if someone from the CI team could chime in and
> evaluate whether such a proposal seems doable and worthwhile.
>
>
>
> Best,
>
> Felix
>


Re: First class support for MxNet?

2019-02-11 Thread Carin Meier
+100 on Iblis's thoughts:

"We know tools and frameworks keep changing.
People learn the lesson from making and attempting.
It's just the path of the human technology evolution.
The point is the ideas/experiences
which this community is going to surprise you at."

- Carin


On Mon, Feb 11, 2019 at 9:08 AM iblis  wrote:

> well, I'm not going to talk about technical stuff.
> You can find some design concepts on doc or wiki.
> (
> https://mxnet.incubator.apache.org/versions/master/architecture/index.html
> )
>
> For me, working on MXNet is a rare chance to verify my ideas of
> a machine learning framework.
> While implementing the MXNet Julia package, I can explicitly compare the
> experience of MXNet with Flux's
> ...and then start complaining about them. :p
> I think one way to move forward is comparison.
> So that's why I said I want to increase the diversity of DL tools in Julia.
>
> I like the spirit of portability in MXNet community.
> We welcome all language packages and are open-minded.
> Although some languages might be considered not popular in ML/DL,
> this community still keeps polishing them day in, day out.
> Yeah, someone has to try them, compare, and gain experience from this
> process regardless of how the language has been evaluated in ML.
> The experience is valuable.
> (e.g. I think lack of function overloading is a disadvantage
>   of Python; the file-based namespace does help for maintainability
>   in Python.
>   After I did some works in Julia, I can clearly point out pros and cons.)
>
>  From a long-term view... maybe twenty years after,
> none of the languages we are using now will be popular.
> But I believe the meta-rules which extracted from experiences are still
> applied.
>
> So.. why not have a Rust lib? maybe Rust's macro can do something crazy,
> maybe.
> e.g. Julia package shows a more elegant way to stack a network than Python,
> thanks to metaprogramming.
>
>mlp = @mx.chain mx.Variable(:data) =>
>  mx.FullyConnected(name=:fc1, num_hidden=128) =>
>  mx.Activation(name=:relu1, act_type=:relu)   =>
>  mx.FullyConnected(name=:fc2, num_hidden=64)  =>
>  mx.Activation(name=:relu2, act_type=:relu)   =>
>  mx.FullyConnected(name=:fc3, num_hidden=10)  =>
>  mx.SoftmaxOutput(name=:softmax)
>
>
> > Wondering where that leaves MxNet...
>
> Actually, I don't care about this issue.
> We know tools and frameworks keep changing.
> People learn the lesson from making and attempting.
> It's just the path of the human technology evolution.
> The point is the ideas/experiences
> which this community is going to surprise you at.
>
>
> Iblis Lin
> 林峻頤
>
> On 2/11/19 12:04 PM, Zach Boldyga wrote:
> > Those are compelling points! There's also another more recent follow-up
> > from the Julia team:
> > https://julialang.org/blog/2018/12/ml-language-compiler
> >
> > It seems that Julia will likely have its place in ML regardless of how
> > other tools progress; the latest offerings from Julia/Flux are really
> > compelling.
> >
> > Wondering where that leaves MxNet...
> >
> > Zach Boldyga
> > Scalabull  |  Founder
> > 1 (866) 846-8771 x 101
> >
>


Re: First class support for MxNet?

2019-02-11 Thread iblis

well, I'm not going to talk about technical stuff.
You can find some design concepts on doc or wiki.
(https://mxnet.incubator.apache.org/versions/master/architecture/index.html)

For me, working on MXNet is a rare chance to verify my ideas of
a machine learning framework.
While implementing the MXNet Julia package, I can explicitly compare the
experience of MXNet with Flux's
...and then start complaining about them. :p
I think one way to move forward is comparison.
So that's why I said I want to increase the diversity of DL tools in Julia.

I like the spirit of portability in MXNet community.
We welcome all language packages and are open-minded.
Although some of these languages might be considered unpopular in ML/DL,
this community still keeps polishing them day in and day out.
Yeah, someone has to try it, compare, and gain experience from this
process, regardless of how the language has been evaluated in ML.
The experience is valuable.
(e.g. I think lack of function overloading is a disadvantage
 of Python; the file-based namespace does help for maintainability
 in Python.
 After I did some works in Julia, I can clearly point out pros and cons.)

From a long-term view... maybe twenty years from now,
none of the languages we are using now will be popular.
But I believe the meta-rules extracted from experience will still apply.

So... why not have a Rust lib? Maybe Rust's macros can do something crazy.
e.g. the Julia package shows a more elegant way to stack a network than Python,
thanks to metaprogramming.

  mlp = @mx.chain mx.Variable(:data) =>
mx.FullyConnected(name=:fc1, num_hidden=128) =>
mx.Activation(name=:relu1, act_type=:relu)   =>
mx.FullyConnected(name=:fc2, num_hidden=64)  =>
mx.Activation(name=:relu2, act_type=:relu)   =>
mx.FullyConnected(name=:fc3, num_hidden=10)  =>
mx.SoftmaxOutput(name=:softmax)
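To make the comparison concrete: the left-to-right piping that @mx.chain gets from a macro can be approximated in plain Python with a small helper. This is only a hedged sketch of the idea — `chain` and the stand-in `layer` constructors below are hypothetical, not part of the MXNet Python API:

```python
from functools import reduce

def chain(value, *layers):
    # Thread a value through each callable left to right,
    # mimicking the data flow of Julia's `=>` chain.
    return reduce(lambda acc, f: f(acc), layers, value)

# Hypothetical stand-in layers that just record their names,
# to make the order of application visible:
def layer(name):
    return lambda net: net + [name]

mlp = chain([], layer("fc1"), layer("relu1"), layer("fc2"), layer("softmax"))
print(mlp)  # ['fc1', 'relu1', 'fc2', 'softmax']
```

The difference the macro makes is syntactic: Julia rewrites the `=>` chain at parse time, while Python needs an explicit helper or nested calls.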



Wondering where that leaves MxNet...


Actually, I don't care about this issue.
We know tools and frameworks keep changing.
People learn the lesson from making and attempting.
It's just the path of the human technology evolution.
The point is the ideas/experiences
which this community is going to surprise you at.


Iblis Lin
林峻頤

On 2/11/19 12:04 PM, Zach Boldyga wrote:

Those are compelling points! There's also another more recent follow-up
from the Julia team: https://julialang.org/blog/2018/12/ml-language-compiler

It seems that Julia will likely have its place in ML regardless of how
other tools progress; the latest offerings from Julia/Flux are really
compelling.

Wondering where that leaves MxNet...

Zach Boldyga
Scalabull  |  Founder
1 (866) 846-8771 x 101



Third-party package tests for MXNet nightly builds

2019-02-11 Thread Felix Hieber
Hello dev@,



I would like to ask around whether there is interest in the community to
test nightly builds of MXNet with third-party packages that depend on MXNet
and act as early adopters. The goal is to catch regressions in MXNet early,
allowing time for bug fixes before a new release is cut.



For example, Sockeye is a consumer of
new MXNet releases and aims to upgrade to the latest MXNet as soon as possible.
Typically, we update our dependency on MXNet once a new release becomes
available (through pip). However, there have been cases where new releases
of MXNet introduced regressions undetected by MXNet tests (hence passing
the release process): the latest example is this issue, which may have
been introduced back in October but, due to infrequent MXNet
releases, has only surfaced recently and will most likely force us to wait
for a post-1.4.0 patch or 1.4.1 release. In this particular example, Sockeye’s
tests would have detected this, and the issue could have been filed as early as
October, potentially avoiding its presence in the 1.4.0 release.



More generally, I think there are several third-party packages with
valuable test suites (e.g. gluon-nlp) that can contribute to catching MXNet
regressions or incompatibilities early. Running these test suites for each
and every PR or commit on the MXNet main repo would be too much overhead.
My proposal would be to trigger these tests with the nightly builds (pip
releases) of MXNet in a separate CI pipeline that is able to notify the 3p
maintainers in case of failure, but does not block MXNet development (or
nightly build releases) in any way.

Roughly it would do the following:

   - pip install mxnet--
   - for each 3p package that is part of the pipeline:
  - clone/set up the package
  - run unit/integration tests of package with some timeout
  - in case of failure, notify package owner
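As a rough illustration of those steps, here is a dry-run sketch that only builds the command list for such a job — the package names, repository URL, and commands are illustrative assumptions, not an existing MXNet CI pipeline:

```python
def plan_nightly_tests(packages, mxnet_pkg="mxnet", timeout_s=3600):
    """Build the shell commands a nightly third-party test job would run.

    `packages` maps a package name to its git URL. All names and commands
    here are illustrative; a real job would also notify the package owner
    when its test step fails rather than just time out.
    """
    steps = [f"pip install --pre {mxnet_pkg}"]  # nightly/pre-release build
    for name, repo in packages.items():
        steps.append(f"git clone {repo} {name}")
        steps.append(f"pip install -e ./{name}")
        steps.append(f"timeout {timeout_s} pytest ./{name}")
    return steps

cmds = plan_nightly_tests({"sockeye": "https://github.com/awslabs/sockeye"})
print(cmds[0])  # pip install --pre mxnet
```

Because it only plans commands, the generator itself can live outside the CI system and be unit-tested; the executing job then just runs the list and reports per-package results.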



I am not familiar with the current CI pipelines, their requirements and
resources. It would be great if someone from the CI team could chime in and
evaluate whether such a proposal seems doable and worthwhile.



Best,

Felix