On Wed, Dec 4, 2013 at 5:05 PM, Chris Barker - NOAA Federal <
chris.bar...@noaa.gov> wrote:

> Ralf,
>
> Great to have you on this thread!
>
> Note: supporting "variants" in one way or another is a great idea, but for
> right now, maybe we can get pretty far without it.
>
> There are options for "serious" scipy users that need optimum performance,
> and newbies that want the full stack.
>
> So our primary audience for "default" installs and pypi wheels is folks
> that need the core packages (maybe a web dev that wants some MPL plots)
> and need things to "just work" more than they need optimized performance.
>

The problem is explaining to people what they want - no one reads docs
before grabbing a binary. On the other hand, using wheels does solve the
issue of people downloading 32-bit installers on 64-bit Windows systems.
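
Wheel filenames encode the Python version, ABI, and platform, so pip on a
64-bit Python will never pick up a 32-bit build; for example (illustrative
filenames, not actual releases):

```
numpy-1.8.0-cp27-none-win32.whl
numpy-1.8.0-cp27-none-win_amd64.whl
```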


> So a lowest common denominator wheel would be very, very useful.
>
> As for what that would be: the superpack is great, but it's been around a
> while (a long while in computer years).
>
> How many non-SSE machines are still out there? How many non-SSE2?
>

Hard to tell. Probably <2%, but that's still too much. Some older Athlon
XPs don't have it, for example. And what if someone submits performance
optimizations (there has been a focus on those recently) to numpy that use
SSE4 or AVX, for example? You don't want to reject those based on the
limitations of your distribution process.

> And how big is the performance boost anyway?
>

Large. For a long time we've put a non-SSE installer for numpy on pypi so
that people would stop complaining that ``easy_install numpy`` didn't work.
Then there were regular complaints about dot products being an order of
magnitude slower than Matlab or R.
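
A quick way to see the difference on any given install is a micro-benchmark
like the following (illustrative sketch; absolute times depend entirely on
which BLAS numpy was linked against):

```python
# Time a matrix product; with an optimized BLAS this is typically an
# order of magnitude faster than with numpy's unoptimized fallback.
import time

import numpy as np

n = 1000
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.time()
c = np.dot(a, b)
elapsed = time.time() - start

print("%dx%d dot product took %.3f s" % (n, n, elapsed))
```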

> What I'm getting at is that we may well be able to build a reasonable win32
> binary wheel that we can put up on pypi right now, with currently available
> tools.
>
> Then MPL and pandas and IPython...
>
> Scipy is trickier-- what with the Fortran and all, but I think we could do
> Win32 anyway.
>
> And what's the hold up with win64? Is that fortran and scipy? If so, then
> why not do win64 for the rest of the stack?
>

Yes, 64-bit MinGW + gfortran doesn't yet work (no place to install DLLs
from the binary; long story). A few people, including David C, are working
on this issue right now. Visual Studio + Intel Fortran would work, but going
with only an expensive toolset like that is kind of a no-go, especially
since I think you'd force everyone else who builds other Fortran
extensions to then also use the same toolset.

> (I, for one, have been a heavy numpy user since the Numeric days, and I
> still hardly use scipy)
>
> By the way, we can/should do OS-X too-- it seems easier in fact (fewer
> hardware options to support, and the Mac's universal binaries)
>
> -Chris
>
> Note on OS-X: how long has it been since Apple shipped a 32-bit machine?
> Can we dump default 32-bit support? I'm pretty sure we don't need to do PPC
> anymore...
>

I'd like to, but we decided to ship the exact same set of binaries as
python.org - which means compiling on OS X 10.5/10.6 and including PPC +
32-bit Intel.

Ralf


>
> On Dec 3, 2013, at 11:40 PM, Ralf Gommers <ralf.gomm...@gmail.com> wrote:
>
>
>
>
> On Wed, Dec 4, 2013 at 1:54 AM, Donald Stufft <don...@stufft.io> wrote:
>
>>
>> On Dec 3, 2013, at 7:36 PM, Oscar Benjamin <oscar.j.benja...@gmail.com>
>> wrote:
>>
>> > On 3 December 2013 21:13, Donald Stufft <don...@stufft.io> wrote:
>> >> I think Wheels are the way forward for Python dependencies. Perhaps
>> >> not for things like Fortran. I hope that the scientific community can
>> >> start publishing wheels, at least in addition.
>> >
>> > The Fortran issue is not that complicated. Very few packages are
>> > affected by it. It can easily be fixed with some kind of compatibility
>> > tag that can be used by the small number of affected packages.
>> >
>> >> I don't believe that Conda will gain the mindshare that pip has
>> >> outside of the scientific community so I hope we don't end up with two
>> >> systems that can't interoperate.
>> >
>> > Maybe conda won't gain mindshare outside the scientific community but
>> > wheel really needs to gain mindshare *within* the scientific
>> > community. The root of all this is numpy. It is the biggest dependency
>> > on PyPI, is hard to build well, and has the Fortran ABI issue. It is
>> > used by very many people who wouldn't consider themselves part of the
>> > "scientific community". For example matplotlib depends on it. The PyPy
>> > devs have decided that it's so crucial to the success of PyPy that
>> > numpy's basically being rewritten in their stdlib (along with the C
>> > API).
>> >
>> > A few times I've seen Paul Moore refer to numpy as the "litmus test"
>> > for wheels. I actually think that it's more important than that. If
>> > wheels are going to fly then there *needs* to be wheels for numpy. As
>> > long as there isn't a wheel for numpy then there will be lots of
>> > people looking for a non-pip/PyPI solution to their needs.
>> >
>> > One way of getting the scientific community more on board here would
>> > be to offer them some tangible advantages. So rather than saying "oh
>> > well scientific use is a special case so they should just use conda or
>> > something", the message should be "the wheel system provides solutions
>> > to many long-standing problems and is even better than conda in (at
>> > least) some ways because it cleanly solves the Fortran ABI issue for
>> > example".
>> >
>> >
>> > Oscar
>>
>> I’d love to get Wheels to the point where they are more suitable than
>> they currently are for SciPy stuff,
>
>
> That would indeed be a good step forward. I'm interested to try to help
> get to that point for Numpy and Scipy.
>
>> I’m not sure what the difference is between their current state and what
>> they need to be, but if someone spells it out (I’ve only just skimmed
>> your last email so perhaps it’s contained in that!) I’ll do the arguing
>> for it. I just need someone who actually knows what’s needed to advise me :)
>>
>
> To start with, the SSE stuff. Numpy and scipy are distributed as
> "superpack" installers for Windows containing three full builds: no SSE,
> SSE2 and SSE3. Plus a script that runs at install time to check which
> version to use. These are built with ``paver bdist_superpack``, see
> https://github.com/numpy/numpy/blob/master/pavement.py#L224. The NSIS and
> CPU selector scripts are under tools/win32build/.
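
The install-time selection the superpack does amounts to something like the
following (a minimal Python sketch of the logic only; the real implementation
is an NSIS script plus a CPU-detection tool under tools/win32build/, and the
variant names here are illustrative):

```python
# Sketch of the superpack's build-selection logic: given the instruction
# sets the CPU supports, pick the fastest build that will still run on it.
def pick_build(cpu_flags):
    """Return the name of the installer variant to use."""
    if "sse3" in cpu_flags:
        return "numpy-sse3"
    if "sse2" in cpu_flags:
        return "numpy-sse2"
    return "numpy-nosse"

print(pick_build({"sse", "sse2", "sse3"}))  # -> numpy-sse3
print(pick_build({"sse", "sse2"}))          # -> numpy-sse2
print(pick_build(set()))                    # -> numpy-nosse
```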
>
> How do I package those three builds into wheels and get the right one
> installed by ``pip install numpy``?
>
> If this is too difficult at the moment, an easier (but much less
> important) one would be to get the result of ``paver bdist_wininst_simple``
> as a wheel.
>
> For now I think it's OK that the wheels would just target 32-bit Windows
> and python.org compatible Pythons (given that that's all we currently
> distribute). Once that works we can look at OS X and 64-bit Windows.
>
> Ralf
>
> _______________________________________________
> Distutils-SIG maillist  -  Distutils-SIG@python.org
> https://mail.python.org/mailman/listinfo/distutils-sig
>
>
