Thanks Kellen for the explanation, +1 for this!

On Sun, Apr 7, 2019 at 6:16 PM Zhao, Patric <patric.z...@intel.com> wrote:

> Agree.
>
> Recently, we (Tao, Shufan, Pengxin) have been trying to integrate the
> Intel MKL math functions into mshadow and MXNet. We have to work across
> two repos and make lots of trade-offs between them. If we can move
> mshadow into MXNet, it will be more flexible to redesign and refactor
> parts of the legacy code.
>
> > -----Original Message-----
> > From: Sheng Zha [mailto:zhash...@apache.org]
> > Sent: Monday, April 8, 2019 5:48 AM
> > To: d...@mxnet.apache.org
> > Subject: Re: assimilation of mshadow into the MXNet codebase
> >
> > mshadow depends on *a* BLAS library, and there's nothing inherent in
> > the mshadow code base that requires OpenBLAS over MKL. The linked issue
> > #11769 seems to be more of a build-logic issue.
> >
> > -sz
> >
> > On 2019/04/07 18:56:43, Aaron Markham <aaron.s.mark...@gmail.com> wrote:
> > > +1
> > > Reduced complexity, and a choice of math library: hopefully you can
> > > just install MKL and not be forced into mshadow's dependency on
> > > OpenBLAS. This could make Windows setup easier.
> > > Maybe this issue will get fixed: #11769.
> > >
> > > On Sun, Apr 7, 2019, 00:51 Junru Shao <junrushao1...@gmail.com> wrote:
> > > >
> > > > Does merging mshadow into mxnet bring any actual benefit to
> > > > customers in terms of performance, portability, or anything else?
> > > >
> > > > On Fri, Apr 5, 2019 at 9:38 PM Tianqi Chen <tqc...@cs.washington.edu> wrote:
> > > > >
> > > > > Technically, mshadow is sufficient for MXNet. Adopting other
> > > > > libraries (eigen or xtensor) would unnecessarily increase the
> > > > > codebase's complexity without any additional gains.
> > > > >
> > > > > Given that mshadow is only used by mxnet, I do support donating
> > > > > it into the mxnet codebase. To respect the original mshadow
> > > > > community, I would recommend starting a community RFC in the
> > > > > mshadow GitHub issues for a week before we start the migration
> > > > > process.
> > > > > Also, I would recommend a rebase merge, just like in the case of
> > > > > the MXNet.jl code base, to preserve the contribution history.
> > > > >
> > > > > Tianqi
> > > > >
> > > > > On Fri, Apr 5, 2019 at 9:25 PM Alfredo Luque
> > > > > <alfredo.lu...@airbnb.com.invalid> wrote:
> > > > > >
> > > > > > Do you have a link to both of these proposals?
> > > > > >
> > > > > > On Fri, Apr 5, 2019 at 20:14 Anirudh Acharya <anirudhk...@gmail.com> wrote:
> > > > > > >
> > > > > > > Hi Pedro,
> > > > > > >
> > > > > > > mshadow is mostly used for tensor arithmetic. There have been
> > > > > > > discussions about including it within mxnet, and I think it
> > > > > > > is a good idea.
> > > > > > >
> > > > > > > As a more long-term solution, using libraries like eigen to
> > > > > > > perform linear algebra operations was also suggested by
> > > > > > > anirudh2290@. I think xtensor
> > > > > > > (https://github.com/QuantStack/xtensor) could also be a
> > > > > > > candidate here.
> > > > > > >
> > > > > > > -
> > > > > > > Anirudh
> > > > > > >
> > > > > > > On Fri, Apr 5, 2019 at 7:03 PM Pedro Larroy
> > > > > > > <pedro.larroy.li...@gmail.com> wrote:
> > > > > > > >
> > > > > > > > Hi
> > > > > > > >
> > > > > > > > Some developers have noticed that working in mshadow is
> > > > > > > > cumbersome, as it's a 3rdparty subrepo.
> > > > > > > >
> > > > > > > > Since mshadow is a bunch of headers without much in the way
> > > > > > > > of independent tests or library functionality, other
> > > > > > > > developers and I believe it would be good to assimilate
> > > > > > > > this code into the repository for ease of contribution and
> > > > > > > > changes, without having to go through contortions to test
> > > > > > > > PRs that modify mshadow.
> > > > > > > >
> > > > > > > > Would anybody oppose this change?
> > > > > > > >
> > > > > > > > Thanks and have a nice weekend.
> > > > > > > >
> > > > > > > > Pedro.
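For readers unfamiliar with history-preserving imports like the one Tianqi recommends above: one plain-git way to fold one repository into another while keeping every original commit is a merge of unrelated histories. The sketch below uses throwaway repos with stand-in names and commit messages; it is a minimal illustration of the mechanism, not the actual migration procedure used for mshadow or MXNet.jl (a rebase-based import would differ in detail).

```shell
#!/bin/sh
set -e

# Identity for the throwaway commits in this sketch
export GIT_AUTHOR_NAME=dev GIT_AUTHOR_EMAIL=dev@example.com
export GIT_COMMITTER_NAME=dev GIT_COMMITTER_EMAIL=dev@example.com

tmp=$(mktemp -d)
cd "$tmp"

# Stand-in for the mshadow repo
git init -q mshadow
git -C mshadow commit -q --allow-empty -m "mshadow: initial commit"

# Stand-in for the MXNet repo
git init -q mxnet
cd mxnet
git commit -q --allow-empty -m "mxnet: initial commit"

# Pull in mshadow's history and merge it; --allow-unrelated-histories
# lets git join the two independent commit graphs
git remote add mshadow ../mshadow
git fetch -q mshadow HEAD
git merge -q --allow-unrelated-histories -m "Merge mshadow history" FETCH_HEAD

# Both original commits plus the merge commit are now in one log,
# so the imported project's contribution history is preserved
git log --oneline
```

After the merge, `git log --follow` and `git blame` on the imported files still attribute lines to their original authors, which is the point of preserving history rather than squashing the import into a single commit.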