Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-20 Thread Nick Coghlan
On 21 May 2015 at 10:52, Wes Turner  wrote:
> On May 20, 2015 7:43 PM, "Nick Coghlan"  wrote:
>> One of my hopes for the metadata extension system in PEP 426 is that
>> we'll be able to define extensions like "fedora.repackage",
>> "debian.repackage"  or "conda.repackage" which include whatever
>> additional info is needed to automate creation of a policy compliant
>> downstream package in a format that's a purely additive complement to
>> the upstream metadata, rather than being somewhat duplicative as is
>> the case today with things like spec files, deb control files, and
>> conda recipes.
>
> http://conda.pydata.org/docs/bdist_conda.html bdist_conda?

conda has the benefit of *not* renaming Python packages in convoluted
ways that interfere with automated identification of dependencies :)

Both conda and Linux distros run into the "it's difficult/impossible
to describe external binary dependencies in a cross-platform way"
problem, though. While https://www.biicode.com/ is interesting in the
context of CMake based projects, that still excludes a lot of
software. (RYPPL is another I'd heard of, but its GitHub repo hasn't
seen much activity since 2013, and ryppl.org appears to be entirely
dead.)

Regards,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-20 Thread Nick Coghlan
On 20 May 2015 at 23:30, Daniel Holth  wrote:
> It occurs to me that the setuptools packaging in general is more like
> a shared library format .so or .dll, aka libraries searched for along
> a path, than an OS level package manager.

Yep, that was what PJE was after for Chandler, so that's what he
built. It was just useful enough for other folks that it was adopted
well beyond that original use case.

The key benefit it offered at the time was that pkg_resources could
use sys.path + the assumption of directory or zip archive based
installation to implement searching for metadata, which avoided the
need to come up with an alternative cross-platform approach to
metadata storage and retrieval.
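
For instance, listing every distribution pkg_resources can see is just a
matter of walking that same metadata (a minimal sketch using only the
public pkg_resources API):

import pkg_resources

# working_set is built by scanning sys.path entries (directories and zip
# archives) for .egg-info / .dist-info metadata
for dist in pkg_resources.working_set:
    print("%s %s (%s)" % (dist.project_name, dist.version, dist.location))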

A related potentially interesting project I've never had time to
pursue is an idea for a virtualenv friendly installation layout that
doesn't quite lead to the same kind of version proliferation as
setuptools did (or as something like NixOS does), while remaining
compatible with sys.path based metadata discovery.

The essential concept would be to assume semantic versioning for
shared installations, and install the shared packages into directories
named as "package/MAJOR_VERSION/".
In each virtualenv that opts in to using the shared package rather
than its own bundled copy, you'd then install a *.pth file that added
"package/MAJOR_VERSION" to sys.path in that environment.

This would be similar in principle to the way Nix user profiles work
(http://nixos.org/releases/nix/nix-0.12/manual/#sec-profiles), but
adapted to be compatible with Python's existing *.pth file mechanism.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-20 Thread Wes Turner
On May 20, 2015 7:43 PM, "Nick Coghlan"  wrote:
>
> On 21 May 2015 at 05:05, Wes Turner  wrote:
> >
> >
> > On Wed, May 20, 2015 at 12:13 PM, Chris Barker 
> > wrote:
> 
>  The package includes its build recipe in info/recipe
> >>>
> >>>
> >>> very cool -- I hadn't seen that -- I'll go take a look at some packages
> >>> and see what I can find.
> >>
> >>
> >> Darn -- the recipe is not there in most (all?) of the packages that came
> >> from Anaconda -- probably due to the legacy issues David referred to.
> >
> > The other day, I upgraded the version of conda-recipes/arrow to v0.5.4, and
> > added ofxparse.
> >
> > I should probably create some sort of recurring cron task to show how far
> > behind stable the version number in the meta.yaml is. (see: conda skeleton
> > --version-compare issue/PR (GH:conda/conda-build))
>
> https://release-monitoring.org/ is a public service for doing that
> (more info on supported upstream backends at
> https://release-monitoring.org/about, more info on the federated
> messaging protocol used to publish alerts at
> http://www.fedmsg.com/en/latest/)
>
> Anitya (the project powering release-monitoring.org) was built as the
> "monitoring" part of Fedora's upstream release notification pipeline:
> https://fedoraproject.org/wiki/Upstream_release_monitoring

Thanks!

>
> One of my hopes for the metadata extension system in PEP 426 is that
> we'll be able to define extensions like "fedora.repackage",
> "debian.repackage"  or "conda.repackage" which include whatever
> additional info is needed to automate creation of a policy compliant
> downstream package in a format that's a purely additive complement to
> the upstream metadata, rather than being somewhat duplicative as is
> the case today with things like spec files, deb control files, and
> conda recipes.

http://conda.pydata.org/docs/bdist_conda.html bdist_conda?

>
> Regards,
> Nick.
>
> --
> Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-20 Thread Nick Coghlan
On 21 May 2015 at 05:05, Wes Turner  wrote:
>
>
> On Wed, May 20, 2015 at 12:13 PM, Chris Barker 
> wrote:

 The package includes its build recipe in info/recipe
>>>
>>>
>>> very cool -- I hadn't seen that -- I'll go take a look at some packages
>>> and see what I can find.
>>
>>
>> Darn -- the recipe is not there in most (all?) of the packages that came
>> from Anaconda -- probably due to the legacy issues David referred to.
>
> The other day, I upgraded the version of conda-recipes/arrow to v0.5.4, and
> added ofxparse.
>
> I should probably create some sort of recurring cron task to show how far
> behind stable the version number in the meta.yaml is. (see: conda skeleton
> --version-compare issue/PR (GH:conda/conda-build))

https://release-monitoring.org/ is a public service for doing that
(more info on supported upstream backends at
https://release-monitoring.org/about, more info on the federated
messaging protocol used to publish alerts at
http://www.fedmsg.com/en/latest/)

Anitya (the project powering release-monitoring.org) was built as the
"monitoring" part of Fedora's upstream release notification pipeline:
https://fedoraproject.org/wiki/Upstream_release_monitoring

One of my hopes for the metadata extension system in PEP 426 is that
we'll be able to define extensions like "fedora.repackage",
"debian.repackage"  or "conda.repackage" which include whatever
additional info is needed to automate creation of a policy compliant
downstream package in a format that's a purely additive complement to
the upstream metadata, rather than being somewhat duplicative as is
the case today with things like spec files, deb control files, and
conda recipes.
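
To make that concrete, such an extension would just be an extra mapping
under the standard "extensions" field of the PEP 426 metadata - something
like the following, where the extension name and its keys are purely
illustrative rather than anything that's actually been defined:

{
  "metadata_version": "2.0",
  "name": "example-dist",
  "version": "1.0",
  "extensions": {
    "fedora.repackage": {
      "license_tag": "MIT",
      "native_build_requires": ["libyaml-devel"]
    }
  }
}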

Regards,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] PyPI and Uploading Documentation

2015-05-20 Thread Nick Coghlan
On 15 May 2015 at 23:48, Donald Stufft  wrote:
> I think that it's time to retire this aspect of PyPI which has never been well
> supported and instead focus on just the things that are core to PyPI. I don't
> have a fully concrete proposal for doing this, but I wanted to reach out here
> and figure out if anyone had any ideas.

Re-reading the current draft of PEP 470, it's worth noting we
currently mention pythonhosted.org as a means of hosting a separate
index page for a project that would like to host packages externally:
https://www.python.org/dev/peps/pep-0470/#deprecation-and-removal-of-link-spidering

That's not a reason to avoid deprecating the documentation hosting
functionality in favour of ReadTheDocs and static HTML hosting
services, just something to take into account.

Regards,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-20 Thread Nick Coghlan
On 21 May 2015 at 08:46, Nick Coghlan  wrote:
> On 21 May 2015 at 03:37, Chris Barker  wrote:
>> As such, it _could_ play the role that pip+wheel (secondarily pypi) play in
>> the python ecosystem.
>
> In practice, it can't, as conda is entirely inappropriate as an input
> format for yum/apt/enstaller/zc.buildout/pypm/MSI/etc. In many ways,
> the barriers that keep conda from being a viable competitor to pip
> from an upstream perspective are akin to those that felled the
> distutils2 project, while the compatible-with-the-existing-ecosystem
> d2to1 has seen far more success.

I think I've finally figured out a short way of describing these
"packaging ideas that simply won't work": if an ecosystem-wide
packaging proposal doesn't work for entirely unmaintained PyPI
packages, it's likely a bad proposal.

This was not only the fatal flaw in the previous distribute/distutils2
approach; it's also the reason we introduced so much additional complexity
into PEP 440 to preserve compatibility with the vast majority of existing
package versions on PyPI (over 98% of existing version numbers were still
accepted), it's one of the key benefits of separating the PyPI-to-end-user
TUF PEP from the dev-to-end-user one, and it's why the "Impact assessment"
section is one of the most important parts of the PEP 470 proposal to
migrate away from offering the current link spidering functionality
(https://www.python.org/dev/peps/pep-0470/#id13).

Coping with this problem is also why injecting setuptools when running
vanilla distutils projects is one of the secrets of pip's success: by
upgrading setuptools, and by tweaking the way pip invokes setup.py
with it injected, we can change the way packages are built and
installed *without* needing to change the packages themselves.
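
The core of the trick is tiny - conceptually something like this
(simplified; pip's real invocation also handles __file__, encodings, and
the command line arguments it passes to setup.py):

import setuptools  # imported purely for its side effects: it patches distutils

# execute the project's vanilla distutils setup.py with setuptools already
# imported, so setuptools-only commands and options become available to it
with open("setup.py") as f:
    exec(compile(f.read(), "setup.py", "exec"))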

Regards,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-20 Thread Nick Coghlan
On 21 May 2015 at 03:37, Chris Barker  wrote:
> As such, it _could_ play the role that pip+wheel (secondarily pypi) play in
> the python ecosystem.

In practice, it can't, as conda is entirely inappropriate as an input
format for yum/apt/enstaller/zc.buildout/pypm/MSI/etc. In many ways,
the barriers that keep conda from being a viable competitor to pip
from an upstream perspective are akin to those that felled the
distutils2 project, while the compatible-with-the-existing-ecosystem
d2to1 has seen far more success.

Rather than being strictly technical, the reasons for this are mostly
political (and partially user experience related) so it's not worth
the futile effort of attempting to change them. When folks try anyway,
it mainly serves to alienate people using (or working on) other
integration platforms rather than achieving anything productive (hence
my comment about the "one package manager to rule them all" attitude
of some conda proponents, although I'll grant they haven't yet gone as
far as the NixOS folks by creating an entirely conda based Linux
distro).

The core requirement for the upstream tooling is to be able to bridge
the gap from publishers of software components implemented in Python
to integrators of software applications and development environments
(regardless of whether those integrators are themselves end users,
redistributors or both). That way, Python developers can focus on
learning one publication toolchain (anchored by pip & PyPI), while
users of integrated platforms can use the appropriate tools for their
platform.

conda doesn't bridge that gap for Python in the general case, as it is
itself an integrator tool managed independently of the PSF and
designed to consume components from *multiple* language ecosystems and
make them available to end users in a common format.

Someone designing a *new* language ecosystem today could quite
reasonably decide not to invent their own distribution infrastructure,
and instead adopt conda as their *upstream* tooling, and have it be
the publication toolchain that new contributors to that ecosystem are
taught, and that downstream integrators are expected to interoperate
with, but that's not the case for Python - Python's far too far down
the distutils->setuptools->pip path to be readily amenable to
alternatives (especially alternatives that are currently still fairly
tightly coupled to the offerings of one particular commercial
redistributor).

Regards,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-20 Thread Wes Turner
On Wed, May 20, 2015 at 12:13 PM, Chris Barker 
wrote:

> The package includes its build recipe in info/recipe
>>>
>>
>> very cool -- I hadn't seen that -- I'll go take a look at some packages
>> and see what I can find.
>>
>
> Darn -- the recipe is not there in most (all?) of the packages that came
> from Anaconda -- probably due to the legacy issues David referred to.
>

The other day, I upgraded the version of conda-recipes/arrow to v0.5.4, and
added ofxparse.

I should probably create some sort of recurring cron task to show how far
behind stable the version number in the meta.yaml is. (see: conda skeleton
--version-compare issue/PR (GH:conda/conda-build))


>
> And since a conda package is "just" a tar archive, you can presumably
> build them in other ways than a conda build recipe.
>
> By the way -- libxml is one example of one without a recipe...
>

Hours of compilation time.

* https://www.google.com/#q=inurl:libxml2+conda+meta.yaml (3 results)
* https://pypi.python.org/pypi?%3Aaction=search&term=buildout+libxml
  * https://pypi.python.org/pypi/z3c.recipe.staticlxml/0.10


>
> -Chris
>
> --
>
> Christopher Barker, Ph.D.
> Oceanographer
>
> Emergency Response Division
> NOAA/NOS/OR&R(206) 526-6959   voice
> 7600 Sand Point Way NE   (206) 526-6329   fax
> Seattle, WA  98115   (206) 526-6317   main reception
>
> chris.bar...@noaa.gov
>
> ___
> Distutils-SIG maillist  -  Distutils-SIG@python.org
> https://mail.python.org/mailman/listinfo/distutils-sig
>
>
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-20 Thread Chris Barker
On Wed, May 20, 2015 at 12:57 AM, Nick Coghlan  wrote:

> This is why I'm such a big fan of richer upstream metadata with
> automated conversion to downstream formats as my preferred long term
> solution - this isn't a "pip vs conda" story, it's "pip vs conda vs
> yum vs apt vs MSI vs nix vs zypper vs zc.buildout vs enstaller vs PyPM
> vs ".


hopefully not "versus", but "working with" ;-) -- but very good point. If
python can do things to make it easier for all these broader systems,
that's a "good thing"


> The main differences I see with conda relative to the other downstream
> package management systems is that it happened to be made by folks
> that are also heavily involved in development of Python based data
> analysis tools,


Which is to say Python itself.


> and that some of its proponents want it to be the "one
> package management tool to rule them all".


I don't know about that -- though another key point is that it is cross
platform (platform independent) -- it may be the only one that does that
part well.


> I consider the latter
> proposal to be as outlandish an idea as believing the world only needs
> one programming language - just as with programming languages,
> packaging system design involves making trade-offs between different
> priorities, so you can't optimise for everything at once. conda's an
> excellent cross-platform end user focused dependency management
> system. This is a good thing, but it does mean conda isn't a suitable
> candidate for use as an input format for other tools that compete with
> it.
>

Hmm -- that's true. But it is, as you said, a "cross-platform end user
focused dependency management system" that handles Python well, in addition
to other things, including libs Python may depend on.

As such, it _could_ play the role that pip+wheel (and secondarily PyPI) play
in the Python ecosystem. You'd still need something like distutils and/or
setuptools to actually handle the building, etc.

And IF we wanted the "official" package manager for Python to fully support
dynamic libs, etc., as well as non-Python software, then it would make sense
to use conda, rather than keep growing pip+wheel until it duplicated conda's
functionality.

But I don't get the impression that that is an end goal for PyPA, and I'm
not sure it should be.

As far as the "we could use a better dynamic linking story for Windows
> and Mac OS X" story goes, now that I understand the general *nix case
> is considered out of scope for the situations Chris is interested in,
>

Exactly -- just like Linux is out of scope for compiled wheels.

> I think there's a reasonable case to be made for being able to
> *bundle* redistributable dynamically linked libraries with a wheel
> file, and for the build process of *other* wheel files to be able to
> rely on those bundled external libraries.


yup -- that's what I have in mind.


> I originally thought the
> request was about being able to *describe* the external dependencies
> in sufficient detail that the general case on *nix could be handled,
> or that an appropriate Windows or Mac OS X binary could be obtained
> out of band, rather than by being bundled with the relevant wheel
> file.
>

Sure would be nice, but no -- I have no fantasies about that.

> Getting a bundling based model to work reliably is still going to be
> difficult (and definitely more complicated than static linking in
> cases where data sharing isn't needed), but it's not intractable the
> way the general case is.
>

Glad you agree -- so the rabbit hole may not be that deep?

There isn't much that should change in pip+wheel+metadata to enable this.
So the way to proceed, if someone wants to do it, could be to simply hack
together some binary wheels of a common dependency or two, build wheels for
a package or two that depend on those, and see how it works.
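
For example, the shared-dependency wheel could be nothing more than a
setuptools package that carries the prebuilt library as package data (every
name below is made up for illustration):

from setuptools import setup

setup(
    name="libfoo-shared",          # hypothetical shared-lib wheel
    version="1.0.0",
    packages=["libfoo_shared"],
    # ship the prebuilt dynamic library inside the wheel as plain data
    package_data={"libfoo_shared": ["libfoo.dylib", "libfoo.dll"]},
    zip_safe=False,
)

Extension wheels built against it would then locate the library at runtime
relative to libfoo_shared.__file__.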

I don't know if/when I'll find the roundtoits to do that -- but I have some
more detailed ideas if anyone wants to talk about it.

Then it becomes a social issue -- package maintainers would have to
actually use these new shared-lib wheels to build against. But that isn't
really that different from the current case of deciding whether to include
a copy of a dependent Python package in your distribution -- and once we
made it easy for users to get dependencies, folks have been happy to shift
that burden elsewhere.

-Chris


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-20 Thread Chris Barker
>
> The package includes its build recipe in info/recipe
>>
>
> very cool -- I hadn't seen that -- I'll go take a look at some packages
> and see what I can find.
>

Darn -- the recipe is not there in most (all?) of the packages that came
from Anaconda -- probably due to the legacy issues David referred to.

And since a conda package is "just" a tar archive, you can presumably build
them in other ways than a conda build recipe.

By the way -- libxml is one example of one without a recipe...

-Chris

-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-20 Thread Chris Barker
On Wed, May 20, 2015 at 6:30 AM, Daniel Holth  wrote:

> The package includes its build recipe in info/recipe
>

very cool -- I hadn't seen that -- I'll go take a look at some packages and
see what I can find.

-CHB

-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Help required for setup.py

2015-05-20 Thread Chris Barker
On Tue, May 19, 2015 at 4:12 PM, salil GK  wrote:

>   I will provide more details about what I need to achieve
>
> I need to create a package for a tool that I created. The tool is a
> wrapper over ovftool, which is provided by VMware. The ovftool install
> binary is provided as a bundle, hence there is no package installed in the
> system ( `dpkg -l` will not list an ovftool package ). ovftool is
> installed in /usr/bin/.
>
>While creating the package I need to check if ovftool is available in
> the system and the version is 4.1.0. If it is not compatible, I need to
> fail the package installation with a proper message. So how do I write
> setup.py to achieve this?
>

You can put arbitrary Python code in setup.py, so before you call setup()
in the file, put something like:

import subprocess

try:
    version = subprocess.check_output(['/usr/bin/ovftool', '--version'])
except (OSError, subprocess.CalledProcessError):
    # OSError covers the case where the binary isn't there at all
    print("ovftool is not properly installed")
    raise

# is_this_the_right_version() is a placeholder for whatever version
# check you need
if not is_this_the_right_version(version):
    raise ValueError("ovftool is not the right version")

of course, you'd probably want better error messages, etc, but hopefully
you get the idea.

-CHB







> Thanks
> Salil
>
> On 19 May 2015 at 07:54, salil GK  wrote:
>
>> Hello
>>
>>I was trying to create my package for distribution. I have a
>> requirement that I need to check if one particular command is available in
>> the system ( this command is not installed through a package - but a bundle
>> is installed to get the command in the system ). I am using Ubuntu 14.04
>>
>> Thanks in advance
>> Salil
>>
>
>
> ___
> Distutils-SIG maillist  -  Distutils-SIG@python.org
> https://mail.python.org/mailman/listinfo/distutils-sig
>
>


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Help required for setup.py

2015-05-20 Thread salil GK
Hello

  I will provide more details about what I need to achieve

I need to create a package for a tool that I created. The tool is a wrapper
over ovftool, which is provided by VMware. The ovftool install binary is
provided as a bundle, hence there is no package installed in the system
( `dpkg -l` will not list an ovftool package ). ovftool is installed in
/usr/bin/.

   While creating the package I need to check if ovftool is available in
the system and the version is 4.1.0. If it is not compatible, I need to
fail the package installation with a proper message. So how do I write
setup.py to achieve this?

Thanks
Salil

On 19 May 2015 at 07:54, salil GK  wrote:

> Hello
>
>I was trying to create my package for distribution. I have a
> requirement that I need to check if one particular command is available in
> the system ( this command is not installed through a package - but a bundle
> is installed to get the command in the system ). I am using Ubuntu 14.04
>
> Thanks in advance
> Salil
>
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-20 Thread Chris Barker
On Wed, May 20, 2015 at 1:04 AM, Paul Moore  wrote:

> > https://github.com/menpo/conda-recipes/tree/master/libxml2
> >
> > don't know anything about it.
>
> OK, I'm still misunderstanding something, I think. As far as I can
> see, all that does is copy a published binary and repack it. There's
> no "build" instructions in there.
>

Indeed -- that is one way to build a conda package, as you well know!

Maybe no one has done a "proper" build-from-scratch recipe for that one --
or maybe Continuum has, and we'll find out about it from David.

-Chris



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-20 Thread Paul Moore
On 20 May 2015 at 15:53, David Mertz  wrote:
> It's not *only* the 'setup.py install', but it's not *that* much mystery
> either.  wxPython I can't seem to find, not sure what I'm missing.

Yeah, I had been under the impression that there was a lot of
knowledge on how to build the dependencies (things like libyaml or
whatever) in there. Looks like that's not the case...

Paul
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-20 Thread David Mertz
On Tue, May 19, 2015 at 4:11 PM, Chris Barker  wrote:
>
> On Tue, May 19, 2015 at 3:09 PM, Paul Moore  wrote:
> "conda" -- a fully open source package management system
> "Anaconda" -- a python and other stuff distribution produced by
Continuum.
>
> How Continuum does or doesn't publish the recipes it used to build
> Anaconda doesn't really have anything to do with conda-the-technology.

True.  Also though, in answer to a question here, I asked a Continuum
colleague on the conda team.  It seems that Anaconda was built using a
proprietary system before conda-build and conda-recipes were opened, so not
all recipes have made it over to the Free side of the fence yet.  But
y'know, gh:conda-build *is* a public repository, anyone could add more.

>>
>> I will note that most recipes seem to consist of either 'python setup.py
install' or './configure; make; make install'.
>
>
> sure -- but those aren't the ones we want ;-)

Understood.

>
> see if you can find the wxPython one, while you are at it :-)
>   --  though I suspect that was built from the "official" executable,
> rather than re-built from scratch.

In the case of pyyaml, this is actually what's "behind the wall"

>
> #!/bin/bash
> patch -p0 <<EOF
> --- setup.cfg~ 2011-05-29 22:31:18.0 -0500
> +++ setup.cfg 2012-07-10 20:33:50.0 -0500
> @@ -4,10 +4,10 @@
> [build_ext]
> # List of directories to search for 'yaml.h' (separated by ':').
> -#include_dirs=/usr/local/include:../../include
> +include_dirs=$PREFIX/include
> # List of directories to search for 'libyaml.a' (separated by ':').
> -#library_dirs=/usr/local/lib:../../lib
> +library_dirs=$PREFIX/lib
> # An alternative compiler to build the extention.
> #compiler=mingw32
> EOF
> $PYTHON setup.py install

It's not *only* the 'setup.py install', but it's not *that* much mystery
either.  wxPython I can't seem to find, not sure what I'm missing.

-- 
The dead increasingly dominate and strangle both the living and the
not-yet born.  Vampiric capital and undead corporate persons abuse
the lives and control the thoughts of homo faber. Ideas, once born,
become abortifacients against new conceptions.
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Making pip and PyPI work with conda packages

2015-05-20 Thread Daniel Holth
The conda package specification is published at
http://conda.pydata.org/docs/spec.html

The file format is nice and simple. "A conda package is a bzipped tar
archive (.tar.bz2) which contains metadata under the info/ directory,
and a collection of files which are installed directly into an install
prefix. The format is identical across platforms and operating
systems. It is important to note that during the install process, all
files are basically just extracted into the install prefix, with the
exception of the ones in info/."

(Compare to the Debian package format's embedded metadata and content archives.)

It has concise metadata in info/index.json

{
  "arch": "x86_64",
  "build": "py27_138_g4f40f08",
  "build_number": 138,
  "depends": [
"jinja2",
"jsonpointer",
"jsonschema",
"mistune",
"pandoc",
"pygments",
"python 2.7*",
"pyzmq",
"terminado",
"tornado"
  ],
  "license": "MIT License",
  "name": "ipython-we",
  "platform": "linux",
  "subdir": "linux-64",
  "version": "3.1.0"
}
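
Since the whole thing is just a bzipped tarball, that metadata can be read
with nothing but the standard library (the filename here is illustrative):

import json
import tarfile

with tarfile.open("ipython-3.1.0-py27_0.tar.bz2", "r:bz2") as pkg:
    raw = pkg.extractfile("info/index.json").read()

index = json.loads(raw.decode("utf-8"))
print("%s %s depends on: %s" % (index["name"], index["version"],
                                ", ".join(index["depends"])))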

The package includes its build recipe in info/recipe

This particular package has setuptools metadata in
lib/python2.7/site-packages/ipython-3.1.0-py2.7.egg-info

On the index, conda packages are organized by placing packages for a
platform+architecture in their own (sub)directory, not by putting all
that information in the filename. According to the docs it doesn't
interpret the platform metadata.

When conda installs a package, it gets unpacked into a common
directory and then linked into each environment, so that it can be
installed to lots of environments without taking up much extra disk
space. Packages can have link and unlink scripts to provide custom
behavior (perhaps fixing up some paths for files that can't just be
linked) when this happens.
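
A toy version of just the linking step might look like this (real conda
also runs the link/unlink scripts, fixes up prefixes, and falls back to
copying when hard links aren't possible):

import os

def link_package(pkg_cache_dir, env_dir):
    # hard-link every extracted file from the shared package cache into the
    # target environment, mirroring the directory layout
    for root, _dirs, files in os.walk(pkg_cache_dir):
        rel = os.path.relpath(root, pkg_cache_dir)
        dest_root = os.path.join(env_dir, rel)
        if not os.path.isdir(dest_root):
            os.makedirs(dest_root)
        for name in files:
            os.link(os.path.join(root, name), os.path.join(dest_root, name))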

It occurs to me that the setuptools packaging in general is more like
a shared library format .so or .dll, aka libraries searched for along
a path, than an OS level package manager.
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-20 Thread Paul Moore
On 20 May 2015 at 00:04, Chris Barker  wrote:
> yup. which makes me think -- maybe not that hard to do a wininst to wheel
> converter for wxPython -- that would be nice. We also need it for the Mac,
> and that would be harder -- he's got some trickery in placing the libs in
> that one...

"wheel convert " already does that. I wrote it, and use
it a lot. It doesn't handle postinstall scripts (because wheels don't
yet) but otherwise should be complete.

Paul
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-20 Thread Paul Moore
On 19 May 2015 at 23:32, Chris Barker  wrote:
> lost track of where in the thred this was, but here's a conda recipe I found
> on gitHub:
>
> https://github.com/menpo/conda-recipes/tree/master/libxml2
>
> don't know anything about it.

OK, I'm still misunderstanding something, I think. As far as I can
see, all that does is copy a published binary and repack it. There's
no "build" instructions in there.

Paul
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] Dynamic linking between Python modules (was: Beyond wheels 1.0: helping downstream, FHS and more)

2015-05-20 Thread Nick Coghlan
On 19 May 2015 at 09:43, Chris Barker  wrote:
> On Mon, May 18, 2015 at 11:21 AM, Paul Moore  wrote:
>> My honest view is that unless conda is intending to replace pip and
>> wheel totally, you cannot assume that people will be happy to use
>> conda alongside pip (or indeed, use any pair of independent packaging
>> tools together - people typically want one unified solution). And if
>> the scientific community stops working towards providing wheels for
>> people without compilers "because you can use conda", there is going
>> to be a proportion of the Python community that will lose out on some
>> great tools as a result.
>
>
> Exactly -- this idea that there are two (or more) non-overlapping
> communities is pretty destructive.

There's a cornucopia of *overlapping* communities. We only rarely hear
from system administrators upstream, for example, as they tend to be
mainly invested in particular operating system or configuration
management communities, leaving upstream mostly to developers and data
analysts. For these admins, a package management system is only going
to be potentially interesting if it is supported by their operating
system or configuration management tool of choice (e.g.
http://docs.ansible.com/list_of_packaging_modules.html for Ansible, or
some of the options linked from Salt's package management abstraction
layer: http://docs.saltstack.com/en/latest/ref/states/all/salt.states.pkg.html)

This is why I'm such a big fan of richer upstream metadata with
automated conversion to downstream formats as my preferred long term
solution - this isn't a "pip vs conda" story, it's "pip vs conda vs
yum vs apt vs MSI vs nix vs zypper vs zc.buildout vs enstaller vs PyPM
vs ". (in addition to the modules listed for Ansible and Salt, I
discovered yet another one today: https://labix.org/smart)

The main differences I see with conda relative to the other downstream
package management systems is that it happened to be made by folks
that are also heavily involved in development of Python based data
analysis tools, and that some of its proponents want it to be the "one
package management tool to rule them all". I consider the latter
proposal to be as outlandish an idea as believing the world only needs
one programming language - just as with programming languages,
packaging system design involves making trade-offs between different
priorities, so you can't optimise for everything at once. conda's an
excellent cross-platform end user focused dependency management
system. This is a good thing, but it does mean conda isn't a suitable
candidate for use as an input format for other tools that compete with
it.

As far as the "we could use a better dynamic linking story for Windows
and Mac OS X" story goes, now that I understand the general *nix case
is considered out of scope for the situations Chris is interested in,
I think there's a reasonable case to be made for being able to
*bundle* redistributable dynamically linked libraries with a wheel
file, and for the build process of *other* wheel files to be able to
rely on those bundled external libraries. I originally thought the
request was about being able to *describe* the external dependencies
in sufficient detail that the general case on *nix could be handled,
or that an appropriate Windows or Mac OS X binary could be obtained
out of band, rather than by being bundled with the relevant wheel
file.

Getting a bundling based model to work reliably is still going to be
difficult (and definitely more complicated than static linking in
cases where data sharing isn't needed), but it's not intractable the
way the general case is.

Regards,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig