Re: [Pythonmac-SIG] Advice wanted on dependency building...

2013-05-24 Thread Matthias Baas
On Thu, May 23, 2013 at 11:41 PM, Chris Barker - NOAA Federal <
chris.bar...@noaa.gov> wrote:

> On Thu, May 23, 2013 at 2:29 PM, Matthias Baas 
> wrote:
>
> > From a user's point of view, I find that Windows installers as generated
> > by bdist_wininst still provide the nicest user experience with OSX
> > packages being a close second.
>
> second? Aren't they essentially the same experience? But anyway..
>

I may be wrong, but I thought pure Python packages that are distributed as
an OSX package are still targeted at one particular version of Python. On
Windows, the installer presents a list of all the installed Python
versions and asks which version you want to install the package for.
Another difference is that on Windows, I can uninstall the package again.


> > You even mention pip as a solution to type 1 users and again, I do agree
> > with this. That's why I find it a bit surprising that in the remainder
> > of this thread, a lot of the discussion is about pip and virtualenv (as
> > far as I can tell, all the solutions that were mentioned were command
> > line solutions), even though you actually didn't want to target this
> > category of users.
>
> I don't agree -- I'm not opposed to requiring a command-line command
> or two -- really you can't get far as any kind of programmer if you
> can't type:
>

I see. I have seen people who only used Python by launching IDLE and doing
everything from there; they never came into contact with a terminal window.
Telling them to first install pip, then install some other packages, would
complicate things for them quite a bit. But I'm not saying we have to
support those people; it was just that your categorization of users
reminded me of them.
If pip is more and more becoming the standard way of installing things,
then, of course, it definitely makes sense to follow its way of handling
packages. I suppose sooner or later, a GUI frontend for it will pop up so
that it becomes easier to install packages.


> The advantage of pip and friends is that it handles dependencies --
> simple binary installers don't do that for you.
>

I'm aware of that. From my personal experience, this has never been an
issue for me though. I'm only using a handful of third-party packages on a
more or less regular basis (such as PyQt, Sphinx, numpy, PIL, PyOpenGL,
pygame) and only occasionally install some other packages to try stuff out,
so for me, there hasn't really been a compelling reason to use pip
instead. (Maybe it's also that it reminds me too much of systems like
MacPorts, and I have just seen MacPorts fail too many times to put much
trust in such a system. But then, this always had to do with building
packages locally, whereas you are planning on providing binaries, so this
won't be an issue. Sorry for my rant!)


> I think my plan is now:
>
> The step to building and distributing a package:
>
> 1) build the deps
> 2) build the package
> 3) build the package installer
> 4) put the package installer up somewhere people can find it.
>
> I'm going to work on a system for (1) and (2) first -- still not sure
> about the dynamic linking part
>
> For now, probably use bdist_mpkg for (3) -- when wheels are generally
> supported, maybe use them.
>
> If/when I get a few packages built, I'll think about where to put them up.
>
> Anyone have an idea for a host? I understand GitHub no longer really
> supports binary distribution...
>

Sounds all good and promising! As for web space, a lot of people don't seem
to like SourceForge anymore, but at least they do provide some web space
and infrastructure where you could put your stuff.

Cheers,

- Matthias -
___
Pythonmac-SIG maillist  -  Pythonmac-SIG@python.org
http://mail.python.org/mailman/listinfo/pythonmac-sig
unsubscribe: http://mail.python.org/mailman/options/Pythonmac-SIG


Re: [Pythonmac-SIG] Advice wanted on dependency building...

2013-05-23 Thread Chris Barker - NOAA Federal
On Wed, May 22, 2013 at 11:53 PM, Ronald Oussoren
 wrote:
>> I'm using the
>> system zlib -- is that a bad idea? Should I build it too, to make sure
>> it matches the rest of it?
>>
>> (I do want the binaries to run anywhere the binary Python I'm using runs)
>
> It depends on the library.

OK -- it would be nice to have a rule to follow, but I guess that
decision should be made for each particular lib.

> I agree w.r.t. homebrew and macports, but it would be nice if 'pip install' 
> would work with your system with minimal changes to the pip configuration 
> (e.g. "just add ... to your piprc and then 'pip install foo' will install a 
> binary from the repo instead of building the binaries itself").

yes -- it sure would -- though this is enough to do without trying to
hack on pip as well

>> Yeah, and he never gave anyone else permission to push to it...
>
> I wouldn't have done that either until the someone else has a proven 
> trackrecord (both in providing usable binaries and in being known in the 
> community).

well, when I say anyone else, I mean anyone else -- I and a few
others regularly contributed, but it involved sending it to him and
hoping he'd find the time to put it up... So if I do this, I'd like to
have at least the option of a handful of folks contributing directly.

>> But if we put the shared libs in a more central location, then all your
>> virtualenvs could use the same ones, yes?
>
> Yes. It would make it harder to switch library versions, but not by much.

hmm -- this is getting tricky for me to wrap my head around -- could
we make it so a pip install inside a virtualenv would install a
dependency in the main Python? Or would you have people install the
libs outside of the virtualenv first, if they didn't want multiple
copies?

> Uninstall can be a problem with that, you'd have to refcount installed files 
> to ensure that libraries are only removed when the last user is uninstalled. 
> I don't know if the installation format used by pip supports having two 
> packages that install the same file.

and I don't want to write that code anyway ;-)

> This can be worked around with fake PyPI packages that only install the 
> shared libraries and have the real packages depend on that (that is a 
> "macbins-libpng" package with libpng.dylib and have the Imaging package 
> depend on that).

I like that idea -- and it looks like that's how Anaconda deals with it.

> /Library can be used, we'd just have to pick a name that Apple is unlikely to 
> use.

I think it should go in /Library/Frameworks/Python somewhere, helps
the uninstall issue -- if folks clear that out, they won't have
anything left behind.

> I'm probably atypical, but my main account doesn't have admin privileges. It 
> would suck if I'd have to use sudo to install.

How do you install the Python from Python.org? I'm just thinking we
should match that...

> The @loader_path option you mentioned in a followup e-mail could help there. 
> That way the shared libraries can be installed in a fixed location relative 
> to sys.prefix, while still supporting virtualenvs. You wouldn't be able to 
> share shared libraries between python versions or virtualenvs, but that's not 
> really a problem (disk space is cheap).

That may be the way to go.


>> Any idea what the time scale is on this?
>
> Before Python 3.4 is out, which means sometime this summer.

OK -- I figure I'll wait until it's there, and then try it out.


>> Have the pip folks made any commitment at all to supporting binary
>> installs? That's a big missing feature.
>
> Yes, through wheels. The development branch in pip's repo 
> () contains support for wheels 
> (both creating and installing), although AFAIK installing wheels requires 
> a command-line argument at the moment because wheel support is experimental 
> at this point.

fair enough -- have you looked into the universal binary issue at all?
easy_install was always getting confused by universal binaries...

> I'll provide mental support at the least, and hope to do more than that but 
> don't know if I can do that time-wise.

You've provided an enormous amount already!

> If wx is hard to package it would be a good stress test of the tools, even if 
> you'd end up not distributing the binaries :-)

yup -- but I'm still not sure if I want to deal with it! -- we'll see.
Maybe Robin would be interested in supporting this shared lib system
if we do get to that.

I'm thinking of setting up a GitHub project for this... I'll let you all
know if/when I do.

-Chris

-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov

Re: [Pythonmac-SIG] Advice wanted on dependency building...

2013-05-23 Thread Chris Barker - NOAA Federal
On Thu, May 23, 2013 at 2:29 PM, Matthias Baas  wrote:

> From a user's point of view, I find that Windows installers as generated
> by bdist_wininst still provide the nicest user experience with OSX
> packages being a close second.

second? Aren't they essentially the same experience? But anyway..

> You even mention pip as a solution to type 1 users and again, I do agree
> with this. That's why I find it a bit surprising that in the remainder
> of this thread, a lot of the discussion is about pip and virtualenv (as
> far as I can tell, all the solutions that were mentioned were command
> line solutions), even though you actually didn't want to target this
> category of users.

I don't agree -- I'm not opposed to requiring a command-line command
or two -- really you can't get far as any kind of programmer if you
can't type:

pip install the_package_I_want.

Plus, all sorts of sources will tell people that that's how you
install a package - it would be great if it "just worked" on the Mac,
even for complex packages

That's a whole lot less than expecting people to do the whole
"./configure; make; make install" dance.

And if someone wants to make a point_and_click wrapper around pip -- great!

The advantage of pip and friends is that it handles dependencies --
simple binary installers don't do that for you.

Also virtualenv is not just an advanced-user problem. It is THE
solution to the python equivalent of DLL Hell. I'm teaching an intro
to Python class, and another instructor (who teaches the web
development part) considers it such an essential tool that we are
considering introducing it very early in the class.
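
A minimal sketch of that workflow (the env path and package name are
illustrative, and `python3 -m venv` is the modern stand-in for the
`virtualenv` command of the day):

```shell
# create an isolated environment (in 2013 this was: virtualenv /tmp/demo-env)
python3 -m venv --without-pip /tmp/demo-env
# activate it; python now resolves inside the env
. /tmp/demo-env/bin/activate
python -c 'import sys; print(sys.prefix)'   # prints a path inside /tmp/demo-env
# then: pip install the_package_I_want
```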

I've always thought that teaching newbies a simpler way to do things
was a bad approach if it's not the way they should be doing it for
"real work". And if we don't support virtualenv, then someone gets
rolling, everything is fine and dandy, then it comes time to deploy, and
they need a virtualenv to keep their dev environment and deployment
environment separate, and WHAM -- they've got to figure out a whole
new, and much more difficult, way to install packages. If we can make
it easy for newbies, but still usable for proper work, we're much
better off.

Finally -- we can have one way that packages are built and installed,
and more than one way to distribute and install them -- i.e. a mpkg,
and also a wheel, when that is properly supported.

>> 2) Static or dynamic?

> Also, it seems like a waste to me
>> for packages that use common dependencies -- how many copies of libpng
>> do I really want linked into my single instance of Python at run time?
>
> Personally, that doesn't really bother me too much, at least not in the
> context of Python development. It's not that all the packages I'm using
> are linked against libpng and that I need all those packages in the same
> program.

I could easily have three copies of libpng at once: wxPython,
matplotlib, PIL. maybe more! But maybe I don't care -- memory is
pretty cheap these days...

> If static linking is not an option and you need to ship a dynamic
> library, I would favor a self-contained solution where the dynamic
> library is just stored in the same directory where the Python extensions
> are that are actually using the library. It seems that with the
> @loader_path mechanism you were mentioning in another email, it
> shouldn't be a problem to implement this.

I think that's the way to go if what we really want is static linking,
but can't do that for some reason with a particular lib...but it
otherwise kills all the advantages of a dynamic lib, so what's the
point?

> I wouldn't bother trying to share libraries across packages (or Python
> installations) for the following reasons:
>
> - By moving the dynamic library outside the "domain" of the package, it
> has become an external dependency and the package provider cannot
> guarantee anymore that the package will really work on the user's
> machine throughout its lifetime. Essentially, the responsibility of
> ensuring compatibility has been moved from the package provider to the user.

Not quite, if the package provider is also the dependency provider --
but you are right, it's harder to control.

> So it could happen that I install package A
> and everything is fine until at some point I install package B which
> overwrites a shared dynamic library and suddenly package A doesn't work
> anymore.

That's exactly why we really don't want to use a standard system
location, like /usr/local...

However, my vision is that building these things will be something of
a coordinated effort, so everything dumped into our lib location
would have been designed to work together -- maybe that's a fantasy.

As I think about it, I guess I'm proposing something like EPD or
Anaconda, but free and community-developed. Or even Christoph Gohlke's
repository of Windows binaries -- they are all built to work together.

> - Uninstalling is more straightforward as I can just remove the main
> directory where the package is in

Re: [Pythonmac-SIG] Advice wanted on dependency building...

2013-05-23 Thread Matthias Baas
On 22.05.13 18:30, Chris Barker - NOAA Federal wrote:
> Users also fall into two categories:
> 
> 1) Folks that do Python development on OS-X much like Linux, etc --
> these folks are likely to use macports or homebrew, or are used to the
> .configure, make, make install dance. We don't need to do anything to
> support these folks -- "pip install" generally works for them.
> 
> 2) folks that want to use a Mac like a Mac, and people that develop
> for those folks --  these people need binary installers, and may want
> to be able to use and deploy either packages or applications (Py2app)
> that will run on systems older than the one developed on, or want
> universal builds, or ???
>   - These are the folks I'd like to support, but I'm still unsure as
> to how best to do that.

I agree, it would be a nice thing to have such a binary repository
again. Thanks for trying to tackle this!

From a user's point of view, I find that Windows installers as generated
by bdist_wininst still provide the nicest user experience with OSX
packages being a close second. From your user category description it
also sounds like this would be the type of user experience that would be
suitable for type 2 users.

You even mention pip as a solution to type 1 users and again, I do agree
with this. That's why I find it a bit surprising that in the remainder
of this thread, a lot of the discussion is about pip and virtualenv (as
far as I can tell, all the solutions that were mentioned were command
line solutions), even though you actually didn't want to target this
category of users.

> How should the dependencies be distributed?
> 
> 1) They should be built to match the Python binary being targeted
> (honestly, I think that's now only the Intel 32-64 bit ones -- PPC
> machines, and pre 10.6, are getting really rare...)

Sounds all right to me (it wasn't actually too long ago that I was still
on 10.4 and I can confirm, it's no fun anymore :) )

> 2) Static or dynamic?
> 
> IIUC, most successful binary packages for the Mac have relied on
> statically linking the dependencies -- this works, and is pretty
> robust. However, it can be kind of a pain to do (though I've finally
> figure how to do it more reliably!). Also, it seems like a waste to me
> for packages that use common dependencies -- how many copies of libpng
> do I really want linked into my single instance of Python at run time?

Personally, that doesn't really bother me too much, at least not in the
context of Python development. It's not that all the packages I'm using
are linked against libpng and that I need all those packages in the same
program.

> But if dynamic, where do you put them? We'll still want to ship them
> with the binary, so people have a one-click install. 

If static linking is not an option and you need to ship a dynamic
library, I would favor a self-contained solution where the dynamic
library is just stored in the same directory where the Python extensions
are that are actually using the library. It seems that with the
@loader_path mechanism you were mentioning in another email, it
shouldn't be a problem to implement this.

I wouldn't bother trying to share libraries across packages (or Python
installations) for the following reasons:

- By moving the dynamic library outside the "domain" of the package, it
has become an external dependency and the package provider cannot
guarantee anymore that the package will really work on the user's
machine throughout its lifetime. Essentially, the responsibility of
ensuring compatibility has been moved from the package provider to the user.

- A build of a library may differ from another build of the same
library, even when they are built from the same version of the source
files. This is because it's not uncommon that libraries can be
configured at compile time (e.g. wide character support, single-threaded
vs multi-threaded, float vs double as fundamental number type, optional
support for any SSE variant, AVX, OpenGL, OpenCL, networking,
internationalization, etc.). So it could happen that I install package A
and everything is fine until at some point I install package B which
overwrites a shared dynamic library and suddenly package A doesn't work
anymore.

- Uninstalling is more straightforward as I can just remove the main
directory where the package is in and I don't have to worry about orphan
libs in some other directories of which I don't even know why they are
on my system.

- If the entire package is self-contained, moving the package or
changing the install location at install-time is not an issue anymore.

- It makes life easier for the package provider as well, as you don't
have to worry about how to manage shared libraries.


Cheers,

- Matthias -



Re: [Pythonmac-SIG] Advice wanted on dependency building...

2013-05-23 Thread Chris Barker - NOAA Federal
On Thu, May 23, 2013 at 12:45 AM, Samuel John  wrote:

> I am from the homebrew team and a passionate Python lover. I can almost feel
> your pain :-)

Thanks for joining the discussion -- really great to have a
homebrew-familiar person to discuss with.

However (and please correct me if I'm wrong), my understanding
of the goal of homebrew (and macports, and fink [anyone still use
fink?]) is to make it easy to build and run stuff on your system, and
indeed, in a way that is optimized for your system (macports, at least,
has a bunch of use flags you can set that will customize the build the
way you like it).

On the other hand, what I'm talking about here is supporting people
that want to:

Distribute binary packages that support:
  - installing on older OS versions (and maybe different
architectures) machines than the developer is running.
  - using those packages with Py2app, to distribute apps that will run
on older machines and OSs.

I _think_ that homebrew's goals don't support that. I know homebrew
supports an option to build "universal" (i386+x86_64) binaries, but:
  a) not all formulae support that -- in general, it's not well tested
  b) I don't think it will build with an older SDK, which you'd need
to support the above.

In fact, on my machine (64 bit, OS-X 10.7), I'm sort-of-running
homebrew, but it only partially works because I've got XCode 3
installed -- and it keeps telling me I should upgrade. I haven't
upgraded XCode, because I haven't figured out how to get XCode 4 to
build the binaries I need (Or needed, it may be the PPC support that
I've lost, and I guess I can dump that now...)

Anyway, if homebrew could support the goals above, it may be a good
way to go to solve the problems at hand.

> Some things install nicely with pip, but others don't. That is why I started
> to maintain a separate "tap" for additional homebrew python formulae
...
> These formulae re-use and depend on other software
> built with homebrew (for example suite-sparse from homebrew/science,
> libpng).
> See https://github.com/samueljohn/homebrew-python if you like.

cool -- very nice!

> With our recent kickstarter supported Mac Minis, we plan to build and
> provide binary "bottles" for all our packages.
> Perhaps that is a good starting point?

If those binary bottles support older OS versions (more or less, the
oldest one that Apple is currently supporting -- I think that's 10.6
now), then yes, that could be a way to go.

> Homebrew works best if installed at `/usr/local` because of the paths to
> other libs.

yup -- I can imagine! And I'm fine with homebrew in /usr/local, but I
don't want something someone installs specifically for MacPython
(outside of homebrew) stomping on it.

> I am currently making Python 2.x and 3.x support in homebrew possible and
> simplifying how to install python software that provides a setup.py.

Question:

Does homebrew put Python in /usr/local and give you a regular old
unix-style install? Or do you get a "proper" Mac Framework install
(wherever it puts it)? I guess the key question is: can you use it to
develop proper desktop apps, and use py2app to make re-distributable
app bundles?

If so, and if it can support older machines with binaries, then maybe
we could build MacPython on top of it (or with it).

But the key issue is if the goals of the homebrew project align with
the goals I outlined above.

Thoughts?

-Chris




-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov


Re: [Pythonmac-SIG] Advice wanted on dependency building...

2013-05-23 Thread Samuel John
Hi,

I am from the homebrew team and a passionate Python lover. I can almost feel
your pain :-)

Some things install nicely with pip, but others don't. That is why I started
to maintain a separate "tap" for additional homebrew python formulae like
pillow, numpy, scipy, matplotlib, pygame ... (one can get these with `brew
tap samueljohn/python` but in the future I want to move them to
homebrew/science or so). These formulae re-use and depend on other software
built with homebrew (for example suite-sparse from homebrew/science,
libpng).
See https://github.com/samueljohn/homebrew-python if you like.

With our recent Kickstarter-supported Mac Minis, we plan to build and
provide binary "bottles" for all our packages.
Perhaps that is a good starting point? Basically one could make an
installer that takes one or more bottles and installs them. Perhaps even
automate creating these installers. Perhaps it's possible to transform our
bottles into the wheel format (I have not looked into that).

Homebrew works best if installed at `/usr/local` because of the paths to
other libs. Often rpath is enough to change that, but some software has
other means to "remember" and hard-code paths, so in general, we build the
bottles in /usr/local and deploy them there, too.

I'd love to team up and join forces.

I am currently making Python 2.x and 3.x support in homebrew possible and
simplifying how to install python software that provides a setup.py.

bests,
 Samuel


Re: [Pythonmac-SIG] Advice wanted on dependency building...

2013-05-22 Thread Ronald Oussoren

On 23 May, 2013, at 0:46, Chris Barker - NOAA Federal  
wrote:

> Thanks Ronald,
> 
> On Wed, May 22, 2013 at 2:53 PM, Ronald Oussoren  
> wrote:
> 
>> To move back onto topic, not relying on unix-level libraries in OSX is a 
>> good thing as it makes it easier to support multiple OSX versions with a 
>> single set of binaries.
> 
> hmm -- I figured if it was a system lib, it should work on whatever
> system it's running on. For example, I'm working right now on the
> netcdf4 lib -- it requires hdf5, which requires zlib. I'm using the
> system zlib -- is that a bad idea? Should I build it too, to make sure
> it matches the rest of it?
> 
> (I do want the binaries to run anywhere the binary Python I'm using runs)

It depends on the library. Zlib should be fine; that library doesn't change 
very often, and mostly in backward-compatible ways. An example of a problematic 
library is OpenSSL, which doesn't have a stable ABI, and hence there are now two 
copies of libcrypto.dylib on OSX 10.8, and the one you'll link to by default is 
not available on all older versions of OSX (not sure how recently that was 
changed, but that's not important).

> 
> 
>> Except for a number of more complicated libraries (such as PIL/Pillow) when 
>> using universal binaries (when using 'pip install', homebrew/macports/... 
>> have their own mechanisms for building).
> 
right -- Universal libs are not well supported by those systems -- but
that's the power users' problem!

I agree w.r.t. homebrew and macports, but it would be nice if 'pip install' 
would work with your system with minimal changes to the pip configuration (e.g. 
"just add ... to your piprc and then 'pip install foo' will install a binary 
from the repo instead of building the binaries itself").

> 
>>> 2) folks that want to use a Mac like a Mac, and people that develop
>>> for those folks --  these people need binary installers, and may want
>>> to be able to use and deploy either packages or applications (Py2app)
>>> that will run on systems older than the one developed on, or want
>>> universal builds, or ???
>>> - These are the folks I'd like to support, but I'm still unsure as
>>> to how best to do that.
>> 
>> It would be nice to have a set of binary "packages", based on a reproducible 
>> build system.
> 
> Exactly what I'd like to build!
> 
>>> Way back when Bob Ippolito maintained a repository of binary packages
>>> for the mac -- it was a great resource, but he's long since moved on
>>> to other things.
>> 
>> The binary packages that Bob maintained had IMHO two major problems:
>> 
>> 1) The largest problem is that the packages were AFAIK created ad-hoc (Bob 
>> or some other contributor did the magic incantations to build library 
>> dependencies)
> 
> Yeah, and he never gave anyone else permission to push to it...

I wouldn't have done that either until someone else has a proven 
track record (both in providing usable binaries and in being known in the 
community). 

> 
>> 2) The packages were Installer.app packages. The current s
> 
>> The header is easily updated using macholib, but that makes installation
>> harder and isn't supported by the standard packaging tools (easy_install
>> and pip)
> 
> But if we put the shared libs in a more central location, then all your
> virtualenvs could use the same ones, yes?

Yes. It would make it harder to switch library versions, but not by much.

> 
>> 2) The primary use case for dynamic linking is to share dylibs between 
>> extensions, and when those extensions are in different PyPI packages the 
>> packaging story gets more complicated. The easiest workaround is to ignore 
>> sharing dylibs and still bundle multiple copies of libpng if two different 
>> PyPI packages both link with libpng.
> 
> when you say bundle, do you mean static link? Or just package up the
> dylib with the bundle, which is what I was thinking -- each package
> installs the libs it needs, which may or may not already have been
> installed by another package -- but so what?

Uninstall can be a problem with that, you'd have to refcount installed files to 
ensure that libraries are only removed when the last user is uninstalled. I 
don't know if the installation format used by pip supports having two packages 
that install the same file.
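
A sketch of the refcounting bookkeeping this would need (purely hypothetical;
as noted, pip has no such mechanism):

```python
# Track how many installed packages reference each shared file, and only
# report a file as deletable when its last referencing package goes away.
from collections import Counter

class SharedFileRegistry:
    def __init__(self):
        self.refs = Counter()

    def install(self, files):
        for f in files:
            self.refs[f] += 1

    def uninstall(self, files):
        removable = []
        for f in files:
            self.refs[f] -= 1
            if self.refs[f] <= 0:
                del self.refs[f]
                removable.append(f)  # last user gone: safe to delete
        return removable

reg = SharedFileRegistry()
reg.install(["lib/libpng.dylib", "site-packages/PIL.so"])
reg.install(["lib/libpng.dylib", "site-packages/matplotlib.so"])
# uninstalling PIL must not remove the still-shared libpng:
print(reg.uninstall(["lib/libpng.dylib", "site-packages/PIL.so"]))
# -> ['site-packages/PIL.so']
```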

This can be worked around with fake PyPI packages that only install the shared 
libraries and have the real packages depend on that (that is a "macbins-libpng" 
package with libpng.dylib and have the Imaging package depend on that).

> And I expect the number of folks building packages will be fairly
> small, so one builder would only have to build one set of dylibs.
> 
>>> But if dynamic, where do you put them? We'll still want to ship them
>> A new framework isn't necessary. There are three locations that could easily 
>> be used:
>> 
>> 1) A directory in Python.framework, for example 
>> /Library/Frameworks/Python.framework/Frameworks
> 
> That makes sense to me.
> 
>> 2) A directory in /Library/Python, fo

Re: [Pythonmac-SIG] Advice wanted on dependency building...

2013-05-22 Thread Ronald Oussoren

On 23 May, 2013, at 7:38, Chris Barker - NOAA Federal  
wrote:

> I just poked a bit into the Anaconda Python distribution. Their
> packages are simple tarballs, but I think they have a dependency
> management system of some sort.
> 
> They deliver the dependencies as separate packages (tarballs), one for
> each lib. It looks like it all gets unpacked into a single dir (in
> /opt), and an example python extension is built like this:
> 
> $ otool -L netCDF4.so
> netCDF4.so:
>   @loader_path/../../libnetcdf.7.dylib (compatibility version 10.0.0,
> current version 10.0.0)
>   @loader_path/../../libhdf5_hl.7.dylib (compatibility version 8.0.0,
> current version 8.3.0)
>   @loader_path/../../libhdf5.7.dylib (compatibility version 8.0.0,
> current version 8.3.0)
>   @loader_path/../../libz.1.dylib (compatibility version 1.0.0, current
> version 1.2.7)
>   /usr/lib/libgcc_s.1.dylib (compatibility version 1.0.0, current version 
> 1.0.0)
>   /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current
> version 111.0.0)
> 
> 
> I don't know how to get that @loader_path thing in there, but this
> seems like a reasonable way to do it (though I guess it wouldn't
> support virtualenv...)

Interesting... @loader_path could work: it expands into the path of the Mach-O
binary that links to the library (in your example, to
os.path.abspath(netCDF4.so)). That can be used to point to a fixed location in
the sys.prefix tree; the relative path for every extension could be different,
but would be known at build time. It might require a macholib step when
building installers, but that shouldn't be a problem.
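
The semantics are simple enough to sketch in a few lines of Python (a
simplified model of what dyld does, using the netCDF4 example from the
quoted otool output):

```python
import os.path

def resolve_load_path(load_path, loader_binary):
    # @loader_path expands to the directory containing the Mach-O binary
    # that references the library (simplified dyld semantics).
    prefix = "@loader_path"
    if load_path.startswith(prefix):
        loader_dir = os.path.dirname(os.path.abspath(loader_binary))
        return os.path.normpath(loader_dir + load_path[len(prefix):])
    return load_path  # absolute install names pass through unchanged

# the extension lives two directories below the lib/ holding the dylibs
resolved = resolve_load_path(
    "@loader_path/../../libnetcdf.7.dylib",
    "/opt/anaconda/lib/python2.7/site-packages/netCDF4.so")
print(resolved)  # -> /opt/anaconda/lib/libnetcdf.7.dylib
```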

@rpath could also work for installs outside of virtualenv-s, but would require
changes to the Python build.

Ronald

P.S. I need to check if macholib (and hence py2app) supports @loader_path,
but adding such support shouldn't be hard as the semantics are pretty easy.


Re: [Pythonmac-SIG] Advice wanted on dependency building...

2013-05-22 Thread Chris Barker - NOAA Federal
I just poked a bit into the Anaconda Python distribution. Their
packages are simple tarballs, but I think they have a dependency
management system of some sort.

They deliver the dependencies as separate packages (tarballs), one for
each lib. It looks like it all gets unpacked into a single dir (in
/opt), and an example python extension is built like this:

$ otool -L netCDF4.so
netCDF4.so:
@loader_path/../../libnetcdf.7.dylib (compatibility version 10.0.0,
current version 10.0.0)
@loader_path/../../libhdf5_hl.7.dylib (compatibility version 8.0.0,
current version 8.3.0)
@loader_path/../../libhdf5.7.dylib (compatibility version 8.0.0,
current version 8.3.0)
@loader_path/../../libz.1.dylib (compatibility version 1.0.0, current
version 1.2.7)
/usr/lib/libgcc_s.1.dylib (compatibility version 1.0.0, current version 
1.0.0)
/usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current
version 111.0.0)


I don't know how to get that @loader_path thing in there, but this
seems like a reasonable way to do it (though I guess it wouldn't
support virtualenv...)

-Chris



On Wed, May 22, 2013 at 3:46 PM, Chris Barker - NOAA Federal
 wrote:
> Thanks Ronald,
>
> On Wed, May 22, 2013 at 2:53 PM, Ronald Oussoren  
> wrote:
>
>> To move back onto topic, not relying on unix-level libraries in OSX is a 
>> good thing as it makes it easier to support multiple OSX versions with a 
>> single set of binaries.
>
> hmm -- I figured if it was a system lib, it should work on whatever
> system it's running on. For example, I'm working right now on the
> netcdf4 lib -- it requires HDF5, which requires zlib. I'm using the
> system zlib -- is that a bad idea? Should I build it too, to make sure
> it matches the rest of it?
>
> (I do want the binaries to run anywhere the binary Python I'm using runs)
>
>
>> Except for a number of more complicated libraries (such as PIL/Pillow) when 
>> using universal binaries (when using 'pip install', homebrew/macports/... 
>> have their own mechanisms for building).
>
> right -- Universal libs are not well supported by those systems -- but
> that's the power users' problem!
>
>>> 2) folks that want to use a Mac like a Mac, and people that develop
>>> for those folks --  these people need binary installers, and may want
>>> to be able to use and deploy either packages or applications (Py2app)
>>> that will run on systems older than the one developed on, or want
>>> universal builds, or ???
>>>  - These are the folks I'd like to support, but I'm still unsure as
>>> to how best to do that.
>>
>> It would be nice to have a set of binary "packages", based on a reproducible 
>> build system.
>
> Exactly what I'd like to build!
>
>>> Way back when Bob Ippolito maintained a repository of binary packages
>>> for the mac -- it was a great resource, but he's long since moved on
>>> to other things.
>>
>> The binary packages that Bob maintained had IMHO two major problems:
>>
>> 1) The largest problem is that the packages were AFAIK created ad-hoc (Bob 
>> or some other contributor did the magic incantations to build library 
>> dependencies)
>
> Yeah, and he never gave anyone else permission to push to it...
>
>> 2) The packages were Installer.app packages. The current state of the art 
>> for development/project environments is to use virtualenv or buildout to 
>> create separated python installations and install all project dependencies 
>> there instead of the global site-packages directory. That's not something 
>> that's easily supported with Installer.app packages.
>
> It was the way to go at the time, but I agree a binary format that
> supports virtualenv would be great.
>
>>> do I really want linked into my single instance of Python at run time?
>>
>> As long as the libpng state isn't shared, static linking isn't really a
>> problem.
>
> good to know, but somehow it still offends my sensibilities
>
>> Dynamic linking has at least two disadvantages:
>>
>> 1) Relocation of the installation prefix is harder due to the way the 
>> dynamic linker on OSX looks for libraries:
>
> yeah -- that is a pain.
>
>> The header is easily updated using macholib, but that makes installation
>> harder and isn't supported by the standard packaging tools (easy_install
>> and pip)
>
> But if we put the shared libs in a more central location, then all your
> virtualenvs could use the same ones, yes?
>
>> 2) The primary use case for dynamic linking is to share dylibs between 
>> extensions, and when those extensions are in different PyPI packages the 
>> packaging story gets more complicated. The easiest workaround is to ignore 
>> sharing dylibs and still bundle multiple copies of libpng if two different 
>> PyPI packages both link with libpng.
>
> when you say bundle, do you mean static link? Or just package up the
> dylib with the bundle, which is what i was thinking -- each package
> installs the libs it needs, which may or may not already have been
> installed by another package -- but so what?

Re: [Pythonmac-SIG] Advice wanted on dependency building...

2013-05-22 Thread Chris Barker - NOAA Federal
Thanks Ronald,

On Wed, May 22, 2013 at 2:53 PM, Ronald Oussoren  wrote:

> To move back onto topic, not relying on unix-level libraries in OSX is a 
> good thing as it makes it easier to support multiple OSX versions with a 
> single set of binaries.

hmm -- I figured if it was a system lib, it should work on whatever
system it's running on. For example, I'm working right now on the
netcdf4 lib -- it requires HDF5, which requires zlib. I'm using the
system zlib -- is that a bad idea? Should I build it too, to make sure
it matches the rest of it?

(I do want the binaries to run anywhere the binary Python I'm using runs)


> Except for a number of more complicated libraries (such as PIL/Pillow) when 
> using universal binaries (when using 'pip install', homebrew/macports/... 
> have their own mechanisms for building).

right -- Universal libs are not well supported by those systems -- but
that's the power users' problem!

>> 2) folks that want to use a Mac like a Mac, and people that develop
>> for those folks --  these people need binary installers, and may want
>> to be able to use and deploy either packages or applications (Py2app)
>> that will run on systems older than the one developed on, or want
>> universal builds, or ???
>>  - These are the folks I'd like to support, but I'm still unsure as
>> to how best to do that.
>
> It would be nice to have a set of binary "packages", based on a reproducible 
> build system.

Exactly what I'd like to build!

>> Way back when Bob Ippolito maintained a repository of binary packages
>> for the mac -- it was a great resource, but he's long since moved on
>> to other things.
>
> The binary packages that Bob maintained had IMHO two major problems:
>
> 1) The largest problem is that the packages were AFAIK created ad-hoc (Bob or 
> some other contributor did the magic incantations to build library 
> dependencies)

Yeah, and he never gave anyone else permission to push to it...

> 2) The packages were Installer.app packages. The current state of the art for 
> development/project environments is to use virtualenv or buildout to create 
> separated python installations and install all project dependencies there 
> instead of the global site-packages directory. That's not something that's 
> easily supported with Installer.app packages.

It was the way to go at the time, but I agree a binary format that
supports virtualenv would be great.

>> do I really want linked into my single instance of Python at run time?
>
> As long as the libpng state isn't shared, static linking isn't really a
> problem.

good to know, but somehow it still offends my sensibilities

> Dynamic linking has at least two disadvantages:
>
> 1) Relocation of the installation prefix is harder due to the way the dynamic 
> linker on OSX looks for libraries:

yeah -- that is a pain.

> The header is easily updated using macholib, but that makes installation
> harder and isn't supported by the standard packaging tools (easy_install
> and pip)

But if we put the shared libs in a more central location, then all your
virtualenvs could use the same ones, yes?

> 2) The primary use case for dynamic linking is to share dylibs between 
> extensions, and when those extensions are in different PyPI packages the 
> packaging story gets more complicated. The easiest workaround is to ignore 
> sharing dylibs and still bundle multiple copies of libpng if two different 
> PyPI packages both link with libpng.

when you say bundle, do you mean static link? Or just package up the
dylib with the bundle, which is what i was thinking -- each package
installs the libs it needs, which may or may not already have been
installed by another package -- but so what?

And I expect the number of folks building packages will be fairly
small, so one builder would only have to build one set of dylibs.

>> But if dynamic, where do you put them? We'll still want to ship them
> A new framework isn't necessary. There are three locations that could easily 
> be used:
>
> 1) A directory in Python.framework, for example 
> /Library/Frameworks/Python.framework/Frameworks

That makes sense to me.

> 2) A directory in /Library/Python, for example /Library/Python/Externals

that feels a bit like Apple's turf, but what do I know?

> 3) As 2), but in the users home directory (~/Library/Python/Externals)
> The latter is the only one where you can install without admin privileges.

But we put the python binaries in /Library/Frameworks -- it seems we
should do the same with libs...


> The folks over on distutils-sig are working towards support for wheels (PEP 
> 427) at least in pip and distribute/setuptools and possibly in the stdlib 
> as well (for 3.4). It would be nice if the OSX package collection would be 
> in wheel format, that would make it relatively easy to install the packages 
> using the de facto standard tools.
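The compatibility information a wheel carries lives in its filename
(name-version-python_tag-abi_tag-platform_tag.whl, per PEP 427); a minimal
parser, with a made-up filename for illustration:

```python
def parse_wheel_name(filename):
    """Split a PEP 427 wheel filename into its components.
    (Ignores the optional build tag for simplicity.)"""
    if not filename.endswith(".whl"):
        raise ValueError("not a wheel: %s" % filename)
    parts = filename[:-4].split("-")
    python_tag, abi_tag, platform_tag = parts[-3:]
    return {"name": parts[0], "version": parts[1],
            "python": python_tag, "abi": abi_tag, "platform": platform_tag}

# Hypothetical filename for an OSX binary wheel:
info = parse_wheel_name("netCDF4-1.0.4-cp27-none-macosx_10_6_intel.whl")
print(info["platform"])  # macosx_10_6_intel
```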

Any idea what the time scale is on this?

> What I haven't looked into yet is how

Re: [Pythonmac-SIG] Advice wanted on dependency building...

2013-05-22 Thread Ronald Oussoren

On 22 May, 2013, at 19:30, Chris Barker - NOAA Federal  
wrote:

> Hey folks,
> 
> I'm looking for advice, and maybe even some consensus among the
> MacPython community, on how to build and distribute packages with
> non-python dependencies.
> 
> As we all know, a number of Python packages require libs that are used
> outside of python itself. These libs fall into (sort of) what I would
> call two categories:
> 
> 1) general purpose libs used by multiple python packages: libpng, freetype, 
> etc.
> (I'm still confused on why Apple doesn't provide all these --
> particularly libpng -- what's up with that?)

In general Apple doesn't ship libraries that it doesn't use itself, and for 
image format support they have their own libraries (in particular the ImageIO 
framework).  

To move back onto topic, not relying on unix-level libraries in OSX is a 
good thing as it makes it easier to support multiple OSX versions with a single 
set of binaries.


> 
> 2) More specific libs, likely only used by a single python package --
> netcdf, proj.4, etc.
> 
> Users also fall into two categories:
> 
> 1) Folks that do Python development on OS-X much like Linux, etc --
> these folks are likely to use macports or homebrew, or are used to the
> .configure, make, make install dance. We don't need to do anything to
> support these folks -- "pip install" generally works for them.

Except for a number of more complicated libraries (such as PIL/Pillow) when 
using universal binaries (when using 'pip install', homebrew/macports/... have 
their own mechanisms for building).

> 
> 2) folks that want to use a Mac like a Mac, and people that develop
> for those folks --  these people need binary installers, and may want
> to be able to use and deploy either packages or applications (Py2app)
> that will run on systems older than the one developed on, or want
> universal builds, or ???
>  - These are the folks I'd like to support, but I'm still unsure as
> to how best to do that.

It would be nice to have a set of binary "packages", based on a reproducible 
build system.

> 
> Way back when Bob Ippolito maintained a repository of binary packages
> for the mac -- it was a great resource, but he's long since moved on
> to other things.

The binary packages that Bob maintained had IMHO two major problems:

1) The largest problem is that the packages were AFAIK created ad-hoc (Bob or 
some other contributor did the magic incantations to build library dependencies)

2) The packages were Installer.app packages. The current state of the art for 
development/project environments is to use virtualenv or buildout to create 
separated python installations and install all project dependencies there 
instead of the global site-packages directory. That's not something that's 
easily supported with Installer.app packages.


> 
> We kind of get away with it because a number of major package
> maintainers have done a good job of providing binaries themselves
> (wxPython, numpy/scipy/matplotlib), but others fail to do so (PIL).
> Plus some of us have minor packages that we want to distribute.
> 
> I'd like to put together an archive, much like Bob's was, or like
> Christoph Gohlke's one for Windows
> (By the way -- that one is HUGE -- I have no idea how he keeps it up!
> http://www.lfd.uci.edu/~gohlke/pythonlibs/)
> 
> But with or without that archive, I still need to build the darn
> things. So now on to the question:
> 
> How should the dependencies be distributed?
> 
> 1) They should be built to match the Python binary being targeted
> (honestly, I think that's now only the Intel 32-64 bit ones -- PPC
> machines, and pre 10.6, are getting really rare...)

No objections there. Supporting PPC is getting way too hard now that
the Xcode version that you can install on recent machines cannot generate
PPC code.  I have VMs with older OSX releases for running the compiler,
but that's barely usable.

> 
> 2) Static or dynamic?
> 
> IIUC, most successful binary packages for the Mac have relied on
> statically linking the dependencies -- this works, and is pretty
robust. However, it can be kind of a pain to do (though I've finally
figured out how to do it more reliably!). Also, it seems like a waste to me
> for packages that use common dependencies -- how many copies of libpng
> do I really want linked into my single instance of Python at run time?

As long as the libpng state isn't shared, static linking isn't really a
problem. It does get a problem when two extensions use the same external
library and share state, for example the curses and curses_panel extensions
in the stdlib.

Dynamic linking has at least two disadvantages:

1) Relocation of the installation prefix is harder due to the way the dynamic 
linker on OSX looks for libraries: when extension foo.so uses library 
libfoo.dylib the absolute path of libfoo.dylib is included in the MachO header 
for foo.so. The header is easily updated using macholib, but that makes
installation harder and isn't supported by the standard packaging tools
(easy_install and pip).

[Pythonmac-SIG] Advice wanted on dependency building...

2013-05-22 Thread Chris Barker - NOAA Federal
Hey folks,

I'm looking for advice, and maybe even some consensus among the
MacPython community, on how to build and distribute packages with
non-python dependencies.

As we all know, a number of Python packages require libs that are used
outside of python itself. These libs fall into (sort of) what I would
call two categories:

1) general purpose libs used by multiple python packages: libpng, freetype, etc.
 (I'm still confused on why Apple doesn't provide all these --
particularly libpng -- what's up with that?)

2) More specific libs, likely only used by a single python package --
netcdf, proj.4, etc.

Users also fall into two categories:

1) Folks that do Python development on OS-X much like Linux, etc --
these folks are likely to use macports or homebrew, or are used to the
.configure, make, make install dance. We don't need to do anything to
support these folks -- "pip install" generally works for them.

2) folks that want to use a Mac like a Mac, and people that develop
for those folks --  these people need binary installers, and may want
to be able to use and deploy either packages or applications (Py2app)
that will run on systems older than the one developed on, or want
universal builds, or ???
  - These are the folks I'd like to support, but I'm still unsure as
to how best to do that.

Way back when Bob Ippolito maintained a repository of binary packages
for the mac -- it was a great resource, but he's long since moved on
to other things.

We kind of get away with it because a number of major package
maintainers have done a good job of providing binaries themselves
(wxPython, numpy/scipy/matplotlib), but others fail to do so (PIL).
Plus some of us have minor packages that we want to distribute.

I'd like to put together an archive, much like Bob's was, or like
Christoph Gohlke's one for Windows
(By the way -- that one is HUGE -- I have no idea how he keeps it up!
http://www.lfd.uci.edu/~gohlke/pythonlibs/)

But with or without that archive, I still need to build the darn
things. So now on to the question:

How should the dependencies be distributed?

1) They should be built to match the Python binary being targeted
(honestly, I think that's now only the Intel 32-64 bit ones -- PPC
machines, and pre 10.6, are getting really rare...)

2) Static or dynamic?

IIUC, most successful binary packages for the Mac have relied on
statically linking the dependencies -- this works, and is pretty
robust. However, it can be kind of a pain to do (though I've finally
figured out how to do it more reliably!). Also, it seems like a waste to me
for packages that use common dependencies -- how many copies of libpng
do I really want linked into my single instance of Python at run time?

But if dynamic, where do you put them? We'll still want to ship them
with the binary, so people have a one-click install. I don't think we
want to install them into a standard location, like /usr/local, as
that could stomp on something else the user is doing. So:
 - Do we put them in the Python Framework, say in:
/Library/Frameworks/Python.framework/Versions/2.7/lib
 - This makes some sense, but then you couldn't share them between,
say, python 2.7 and 3.3 (and however many other versions...)
  - Do we create a new Framework, say:
/Library/Frameworks/PythonDeps.framework and put them all there? But
this may confuse things for different deployment targets.

If we go one of these routes, would we have a separate installer for
the libs, and all the other installers would depend on that? Or would
each installer put a copy of the libs it needed into the common
location, maybe writing over one that's already there (which should be
OK -- it should be compatible, or have a different version number,
etc.)
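That "compatible, or a newer version number" rule could be as simple as
comparing dotted version strings numerically; a sketch (the version numbers
are the kind otool reports, not taken from any real installer):

```python
def parse_version(s):
    """Turn a dotted version string like '1.2.7' into a comparable tuple."""
    return tuple(int(p) for p in s.split("."))

def should_overwrite(installed, incoming):
    """Overwrite an already-installed dylib only if the incoming copy
    claims a strictly newer version; equal or older copies are skipped."""
    return parse_version(incoming) > parse_version(installed)

print(should_overwrite("1.2.5", "1.2.7"))  # True
print(should_overwrite("1.2.7", "1.2.7"))  # False
```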

Note that I've used the term "we" here ;-)  I'm hoping that others
will join me in following a convention and getting stuff out there,
but even if not, I'd love feedback on how best to do it.

By the way, the other goal is to build scripts that do the builds the
way we need for various libs, packages, etc, so that it's easy to do
it all when new builds are required...
(maybe use gattai? -- http://sourceforge.net/projects/gattai/)

Feedback please!!

-Chris

-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R(206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov