Re: [Numpy-discussion] Questions about cross-compiling extensions for mac-ppc and mac-intel

2007-02-26 Thread Christopher Barker
Zachary Pincus wrote:
> [...] building python extensions as Mac-PPC and Mac-Intel fat
> binaries, so I'm turning to the wisdom of this list for a few questions.

I'd try the pythonmac list too -- there are folks there that actually 
understand all this!

> My general goal is to make a double-clickable Mac installer of a set
> of tools built around numpy, numpy's distutils, a very hacked-up
> version of PIL, and some fortran code too. To this end, I need to
> figure out how to get the numpy distutils to cross-compile,
> generating PPC code and Intel code in separate builds -- and/or
> generating a universal binary all in one go. (I'd like to distribute
> a universal version of numpy, but I think that my own code needs to
> be built/distributed separately for each architecture due to
> endianness issues.)

hmm -- maybe you'd be better off dealing with the endian issues in your 
code -- i.e. dealing with it at runtime, rather than compile time.
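
For instance (a minimal, untested sketch): numpy records the byte order
in the dtype, so you can normalize incoming data at runtime no matter
which architecture you're running on:

    import numpy as np

    def to_native(arr):
        # dtype.byteorder is '=' (native) or '|' (not applicable) when
        # no swap is needed; '>' or '<' means the data is foreign order.
        if arr.dtype.byteorder in ('=', '|'):
            return arr
        # Swap the bytes, then relabel the dtype as native order.
        return arr.byteswap().view(arr.dtype.newbyteorder('='))

    # e.g. data that was written big-endian on a PPC box:
    big = np.arange(5, dtype=np.dtype('>f8'))
    native = to_native(big)

Then the same data files work on both architectures, and you only need
one build per platform Python.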

> Is there explicit support in distutils for this, or is it a matter of
> setting the proper environment variables to entice gcc and gfortran
> to generate code for a specific architecture?

I'm no expert, but the glory of distutils is that it will, by default,
build extensions the same way as python itself was built. So if you use
a PPC python, you'll get PPC extensions, same with Intel, and if you use
a Universal Python, you'll get a Universal extension.
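
That said, if you do want to force the architectures by hand, I believe
you can pass the compiler flags per extension -- something like this
(untested sketch; 'mymodule' is just a placeholder):

    from distutils.core import setup, Extension

    # The -arch flags ask Apple's gcc for a fat (ppc + i386) object;
    # the 10.4u SDK provides fat versions of the system libraries.
    ARCH_FLAGS = ['-arch', 'ppc', '-arch', 'i386',
                  '-isysroot', '/Developer/SDKs/MacOSX10.4u.sdk']

    setup(name='mymodule',
          version='0.1',
          ext_modules=[Extension('mymodule',
                                 sources=['mymodule.c'],
                                 extra_compile_args=ARCH_FLAGS,
                                 extra_link_args=ARCH_FLAGS)])

(Setting CFLAGS/LDFLAGS in the environment before running setup.py
should have a similar effect.)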

The trick is that while you can build Universal on either platform, you
can't use this trick to build Intel extensions on a PPC mac, as the
Python would have to be Intel, and a PPC mac won't run an Intel Python.
It may be possible to run a PPC Python on an Intel Mac with Rosetta,
though.

In any case, Universal is probably the best bet except for your Fortran
code -- no one has made a Fortran compiler that can do Universal. You may
be able to build the two parts independently and use lipo to put them
together, however.
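
Something like this, maybe (untested; the paths are placeholders for
wherever your two single-architecture builds land):

    import subprocess

    ppc   = 'build-ppc/_mymodule.so'    # built on/for PPC
    intel = 'build-i386/_mymodule.so'   # built on/for Intel
    fat   = 'dist/_mymodule.so'

    # lipo stitches the two thin Mach-O files into one universal binary.
    subprocess.check_call(['lipo', '-create', ppc, intel, '-output', fat])

    # Sanity check: should report both ppc and i386.
    subprocess.check_call(['lipo', '-info', fat])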

Googling this list and the pythonmac one should get you some discussion 
of this, but AFAIK, no one has done it yet.

If you do need to have your Fortran stuff separate, you can still make
the rest of it Universal.

> One problem is that PIL is a tricky beast, even in the neutered form
> in which I'm using it. It does a compile-time check for the endianness
> of the system, and a compile-time search for the zlib to use, both of
> which are problematic.

Well, I know it's possible to build it Universal. There are binaries on
pythonmac.org/packages. The folks on the pythonmac list should be able
to tell you how. (zlib is included with OS X, so that shouldn't be an
issue.)

> To address the former, I'd like to be able to (say) include something
> like 'config_endian --big' on the 'python setup.py' command-line, and
> have that information trickle down to the PIL config script (a few
> subpackages deep). Is this easy or possible?

I doubt it, but there has got to be a way to tie endianness to platform.
You'd want the Intel code built one way, and the PPC code another. I
think distutils may take care of this for you.
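
For your separate per-architecture builds, one approach (sketch only;
WORDS_BIGENDIAN is just an illustrative macro name) is to let the Python
running setup.py report its own byte order and bake that in as a
preprocessor define:

    import sys
    from distutils.core import setup, Extension

    # Only valid for single-architecture builds: the interpreter doing
    # the build has the same byte order as the code it produces.
    if sys.byteorder == 'big':
        macros = [('WORDS_BIGENDIAN', '1')]
    else:
        macros = []

    setup(name='imaging',
          version='0.1',
          ext_modules=[Extension('_imaging',      # placeholder names
                                 sources=['imaging.c'],
                                 define_macros=macros)])

For a true Universal build that trick can't work -- one compile covers
both byte orders -- but I believe Apple's gcc predefines __BIG_ENDIAN__
or __LITTLE_ENDIAN__ per architecture, so the C code can check those
instead.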

Good luck! And if you find a way to build a universal Fortran extension 
-- be sure to let us know!

-Chris

-- 
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR             (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

[EMAIL PROTECTED]


Re: [Numpy-discussion] Questions about cross-compiling extensions for mac-ppc and mac-intel

2007-02-26 Thread Robert Kern
Christopher Barker wrote:

> I'm no expert, but the glory of distutils is that it will, by default,
> build extensions the same way as python itself was built. So if you use
> a PPC python, you'll get PPC extensions, same with Intel, and if you use
> a Universal Python, you'll get a Universal extension.

There is a little wrinkle in that numpy configures itself by compiling and
running small C programs to determine what is supported on its platform. When
building on an Intel machine even with a Universal Python, the results of that
configuration will only be for the system it was compiled on. Thus, even though
Universal binaries built on 10.4 systems would usually work on 10.3.9, numpy
doesn't.

> The trick is that while you can build Universal on either platform, you
> can't use this trick to build Intel extensions on a PPC mac, as the
> Python would have to be Intel, and a PPC mac won't run an Intel Python.
> It may be possible to run a PPC Python on an Intel Mac with Rosetta,
> though.
>
> In any case, Universal is probably the best bet except for your Fortran
> code -- no one has made a Fortran compiler that can do Universal.

The R folks have a package containing gcc 4.0.3 with gfortran that looks like it
might be Universal. I haven't tried to build scipy with it, yet.

-- 
Robert Kern

I have come to believe that the whole world is an enigma, a harmless enigma
 that is made terrible by our own mad attempt to interpret it as though it had
 an underlying truth.
  -- Umberto Eco


Re: [Numpy-discussion] Questions about cross-compiling extensions for mac-ppc and mac-intel

2007-02-26 Thread Christopher Barker
Robert Kern wrote:
> even though Universal binaries built on 10.4 systems would usually
> work on 10.3.9, numpy doesn't.

Darn. But I, for one, can live without 10.3.9 support -- it does build
Universal properly for 10.4, doesn't it?

> The R folks have a package containing gcc 4.0.3 with gfortran that
> looks like it might be Universal. I haven't tried to build scipy with
> it, yet.

cool!

-Chris



-- 
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR             (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

[EMAIL PROTECTED]


Re: [Numpy-discussion] Questions about cross-compiling extensions for mac-ppc and mac-intel

2007-02-26 Thread Robert Kern
Christopher Barker wrote:
> Robert Kern wrote:
>> even though Universal binaries built on 10.4 systems would usually
>> work on 10.3.9, numpy doesn't.
>
> Darn. But I, for one, can live without 10.3.9 support -- it does build
> Universal properly for 10.4, doesn't it?

I've never tested it.

-- 
Robert Kern

I have come to believe that the whole world is an enigma, a harmless enigma
 that is made terrible by our own mad attempt to interpret it as though it had
 an underlying truth.
  -- Umberto Eco


[Numpy-discussion] Questions about cross-compiling extensions for mac-ppc and mac-intel

2007-02-24 Thread Zachary Pincus
Hi folks,

I've been doing a lot of web-reading on the subject, but have not  
been completely able to synthesize all of the disparate bits of  
advice about building python extensions as Mac-PPC and Mac-Intel fat  
binaries, so I'm turning to the wisdom of this list for a few questions.

My general goal is to make a double-clickable Mac installer of a set  
of tools built around numpy, numpy's distutils, a very hacked-up  
version of PIL, and some fortran code too. To this end, I need to  
figure out how to get the numpy distutils to cross-compile,  
generating PPC code and Intel code in separate builds -- and/or  
generating a universal binary all in one go. (I'd like to distribute  
a universal version of numpy, but I think that my own code needs to  
be built/distributed separately for each architecture due to
endianness issues.)

Is there explicit support in distutils for this, or is it a matter of  
setting the proper environment variables to entice gcc and gfortran  
to generate code for a specific architecture?

One problem is that PIL is a tricky beast, even in the neutered form
in which I'm using it. It does a compile-time check for the endianness
of the system, and a compile-time search for the zlib to use, both of
which are problematic.

To address the former, I'd like to be able to (say) include something  
like 'config_endian --big' on the 'python setup.py' command-line, and  
have that information trickle down to the PIL config script (a few  
subpackages deep). Is this easy or possible?

To address the latter, I think I need to have the PIL extensions
dynamically link against
'/Developer/SDKs/MacOSX10.4u.sdk/usr/lib/libz.dylib', which is the
fat-binary version of the library, using the headers from
'/Developer/SDKs/MacOSX10.4u.sdk/usr/include/zlib.h'. Right now, PIL
is using system_info from numpy.distutils to find the valid library
paths on which libz and its headers might live. This is nice and more
or less platform-neutral, which I like. How best should I
convince/configure numpy.distutils.system_info to put
'/Developer/SDKs/MacOSX10.4u.sdk/usr/{lib,include}' on the output of
get_include_dirs() and get_lib_dirs()?
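
I'm imagining something along these lines (untested sketch -- I'm
assuming default_lib_dirs and default_include_dirs are the module-level
lists that the system_info classes consult):

    from numpy.distutils import system_info

    # Put the fat 10.4u SDK copies of libz ahead of the usual locations.
    sdk = '/Developer/SDKs/MacOSX10.4u.sdk/usr'
    system_info.default_lib_dirs.insert(0, sdk + '/lib')
    system_info.default_include_dirs.insert(0, sdk + '/include')

    # ...then run the normal configuration/setup() below, so that
    # get_lib_dirs() and get_include_dirs() report the SDK paths first.

Is that sane, or is a site.cfg the intended mechanism here?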

Thanks for any advice or counsel,

Zach Pincus

Program in Biomedical Informatics and Department of Biochemistry
Stanford University School of Medicine
