[Numpy-discussion] NumPy array from ctypes data

2007-02-26 Thread Andrew McMurry
How can I create a NumPy array which is backed by ctypes data?

i.e. create a NumPy array from a ctypes one without any data copying?

That is, changing an element of the NumPy array should alter the
underlying ctypes array.  I have C structures that contain numerical
data and would like to access that data through the NumPy array
interface.

I have tried the following:

PyObject *make_dbl_array(double *data, int n)
{
   npy_intp dim = n;
   return PyArray_SimpleNewFromData(1, dim, NPY_DOUBLE, (void *)data);
}

But when called via ctypes (with restype set to ctypes.py_object) it  
gives a bus error.
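
For what it's worth, here is a minimal pure-Python sketch of the behaviour
I am after (assuming a ctypes array exposes the buffer interface, which
numpy.frombuffer can wrap without copying; the names here are only
illustrative):

import ctypes
import numpy

# Illustrative only: a small ctypes array of doubles.
buf = (ctypes.c_double * 4)(1.0, 2.0, 3.0, 4.0)

# frombuffer wraps the existing memory rather than copying it, so writes
# through the NumPy view show up in the ctypes array and vice versa.
view = numpy.frombuffer(buf, dtype=numpy.float64)
view[0] = 99.0
print(buf[0])   # 99.0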

Andrew



Re: [Numpy-discussion] [Matplotlib-users] Unifying numpy, scipy, and matplotlib docstring formats

2007-02-26 Thread Alan G Isaac
On Sun, 25 Feb 2007, Jouni K. Seppänen apparently wrote: 
 it is easier to type something like 
   [0 1 0; 1 0 0; 0 0 1] 
 than 
   array([[0,1,0],[1,0,0],[0,0,1]]) 


x = numpy.mat('0 1 0; 1 0 0; 0 0 1').A

hth,
Alan Isaac






Re: [Numpy-discussion] Questions about cross-compiling extensions for mac-ppc and mac-intel

2007-02-26 Thread Christopher Barker
Zachary Pincus wrote:
 building python extensions as Mac-PPC and Mac-Intel fat  
 binaries, so I'm turning to the wisdom of this list for a few questions.

I'd try the pythonmac list too -- there are folks there that actually 
understand all this!

 My general goal is to make a double-clickable Mac installer of a set  
 of tools built around numpy, numpy's distutils, a very hacked-up  
 version of PIL, and some fortran code too. To this end, I need to  
 figure out how to get the numpy distutils to cross-compile,  
 generating PPC code and Intel code in separate builds -- and/or  
 generating a universal binary all in one go. (I'd like to distribute  
 a universal version of numpy, but I think that my own code needs to  
 be built/distributed separately for each architecture due to endianness
 issues.)

hmm -- maybe you'd be better off dealing with the endian issues in your
code -- i.e. handling them at runtime rather than at compile time.
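
For example, a quick sketch of what I mean -- numpy lets you spell out the
byte order in the dtype and convert to the native order at runtime (the
dtype and values here are just illustrative):

import numpy

# Illustrative: data known to have been written as big-endian doubles.
big = numpy.arange(5, dtype='>f8')    # explicitly big-endian
native = big.astype(numpy.float64)    # copy converted to the native order
print(native.dtype.byteorder)         # '=' i.e. native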

 Is there explicit support in distutils for this, or is it a matter of  
 setting the proper environment variables to entice gcc and gfortran  
 to generate code for a specific architecture?

I'm no expert, but the glory of distutils is that it will, by default,
build extensions the same way as python itself was built. So if you use
a PPC python, you'll get PPC extensions, same with Intel, and if you use
a Universal Python, you'll get a Universal extension.

The trick is that while you can build Universal on either platform, you
can't use this trick to build Intel extensions on a PPC mac, as the
Python would have to be Intel, and a PPC mac won't run an Intel Python.
It may be possible to run a PPC Python on an Intel Mac with Rosetta,
though.

In any case, Universal is probably the best bet except for your Fortran
code -- no one has made a Fortran compiler that can do Universal. You may
be able to build the two parts independently and use lipo to put them
together, however.

Googling this list and the pythonmac one should get you some discussion 
of this, but AFAIK, no one has done it yet.

If you do need to have your Fortran stuff separate, you can still make
the rest of it Universal.

 One problem is that PIL is a tricky beast, even in the neutered form  
 that I'm using it. It does a compile-time check for the endian-ness  
 of the system, and a compile-time search for the zlib to use, both of  
 which are problematic.

Well, I know it's possible to build Universal. There are binaries on 
pythonmac.org/packages. The folks on the pythonmac list should be able 
to tell you how. (zlib is included with OS X, so that shouldn't be an
issue.)

 To address the former, I'd like to be able to (say) include something  
 like 'config_endian --big' on the 'python setup.py' command-line, and  
 have that information trickle down to the PIL config script (a few  
 subpackages deep). Is this easy or possible?

I doubt it, but there has got to be a way to tie endianness to platform.
You'd want the Intel code built one way, and the PPC code another. I
think distutils may take care of this for you.

Good luck! And if you find a way to build a universal Fortran extension 
-- be sure to let us know!

-Chris

-- 
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

[EMAIL PROTECTED]


Re: [Numpy-discussion] Questions about cross-compiling extensions for mac-ppc and mac-intel

2007-02-26 Thread Robert Kern
Christopher Barker wrote:

 I'm no expert, but the glory of distutils is that it will, by default 
 build extensions the same way as python itself was built. So if you use 
 a PPC python, you'll get PPC extensions, same with Intel, and if you use 
   a Universal Python, you'll get a Universal extension.

There is a little wrinkle in that numpy configures itself by compiling and
running small C programs to determine what is supported on its platform. When
building on an Intel machine even with a Universal Python, the results of that
configuration will only be for the system it was compiled on. Thus, even though
Universal binaries built on 10.4 systems would usually work on 10.3.9, numpy
doesn't.

 The trick is that while you can build Universal on either platform, you
 can't use this trick to build Intel extensions on a PPC mac, as the
 Python would have to be Intel, and a PPC mac won't run an Intel Python.
 It may be possible to run a PPC Python on an Intel Mac with Rosetta,
 though.
 
 In any case, Universal is probably the best bet except for your Fortran
 code -- no one has made a Fortran compiler that can do Universal.

The R folks have a package containing gcc 4.0.3 with gfortran that looks like it
might be Universal. I haven't tried to build scipy with it, yet.

-- 
Robert Kern

I have come to believe that the whole world is an enigma, a harmless enigma
 that is made terrible by our own mad attempt to interpret it as though it had
 an underlying truth.
  -- Umberto Eco


Re: [Numpy-discussion] Questions about cross-compiling extensions for mac-ppc and mac-intel

2007-02-26 Thread Christopher Barker
Robert Kern wrote:
 even though
 Universal binaries built on 10.4 systems would usually work on 10.3.9, numpy
 doesn't.

Darn, but I, for one, can live without 10.3.9 support -- it does build
Universal properly for 10.4, doesn't it?

 The R folks have a package containing gcc 4.0.3 with gfortran that looks
 like it might be Universal. I haven't tried to build scipy with it, yet.

cool!

-Chris



-- 
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/ORR            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

[EMAIL PROTECTED]


[Numpy-discussion] Advice on masked array implementation

2007-02-26 Thread Fred Clare
We would like some advice on how to proceed with implementing
masked array capabilities in a large package of climate-related
analysis functions.  We are in the initial stages of trying to
duplicate functionality from an existing package written in a
locally-developed scripting language.  The existing functionality
depends heavily on masked arrays (actually on variables with
attributes, with fill_value being one such).

We have polled our user base and, while all responders plan to
convert to NumPy, many have not started the conversion or are in
transition.  It is our experience that converting users to
new ways is usually a multi-year undertaking, so it seems
that we may need to support both Numeric and NumPy installations
for some time to come.  Even if people have converted to the
NumPy version of our package, they may still be importing
packages that have not been converted.

Our initial design attempts to deal with the possibility of users
mixing (or using exclusively) Numeric arrays, Numeric masked arrays,
NumPy arrays, and NumPy masked arrays.  For example, suppose you have
a function:

result = func(arg0, arg1)

where the two arguments and return variable can be any one of
the four types of arrays mentioned.  Currently we are testing
to see if either argument is a NumPy or Numeric masked array.
If just Numeric masked arrays, then we return a Numeric masked
array.  If just NumPy masked arrays, then we return a NumPy
masked array.  If one is a Numeric masked array and the other
is a NumPy masked array, then we return a NumPy masked array.
Similar checking is done for using just Numeric and/or NumPy
non-masked arrays.  Does this seem like a reasonable approach?
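
In rough outline, the type dispatch we have in mind looks something like
the sketch below (assuming Numeric's MA module and numpy.core.ma are both
importable, and that MA.MaskedArray is the Numeric masked-array class; the
function name and return labels are only illustrative):

import numpy.core.ma   # NumPy masked arrays (their location as of numpy 1.0)
import MA              # Numeric masked arrays

def result_flavor(arg0, arg1):
    # A NumPy masked array takes precedence: if either argument is one,
    # the result is a NumPy masked array.
    if isinstance(arg0, numpy.core.ma.MaskedArray) or \
       isinstance(arg1, numpy.core.ma.MaskedArray):
        return "NumPy masked array"
    # Otherwise, if either argument is a Numeric masked array, the result
    # is a Numeric masked array.
    if isinstance(arg0, MA.MaskedArray) or isinstance(arg1, MA.MaskedArray):
        return "Numeric masked array"
    # Similar checks follow for plain NumPy vs. Numeric arrays.
    return "unmasked array"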

It is tempting just to go with NumPy, but then we will have a
large class of users who cannot access the new functionality.

We have followed the discussion on the development of the new
maskedarray module, but have not used it.  I went to

http://projects.scipy.org/scipy/numpy/attachment/wiki/MaskedArray/maskedarray.py

as referenced in a posting from Pierre GM, but I got an "Internal Error".

Has there been any decision as to whether maskedarray will be
in NumPy version 1.1?  Any estimate as to when 1.1 would be out?
If we commit to numpy.core.ma now, how much trouble will it be
to convert to the new maskedarray?  Is there any user documentation
on maskedarray and details on the differences between it and
numpy.core.ma?

Thanks,

Fred Clare



Re: [Numpy-discussion] Questions about cross-compiling extensions for mac-ppc and mac-intel

2007-02-26 Thread Robert Kern
Christopher Barker wrote:
 Robert Kern wrote:
 even though
 Universal binaries built on 10.4 systems would usually work on 10.3.9, numpy
 doesn't.
 
 Darn, but I, for one, can live without 10.3.9 support -- it does build
 Universal properly for 10.4, doesn't it?

I've never tested it.

-- 
Robert Kern

I have come to believe that the whole world is an enigma, a harmless enigma
 that is made terrible by our own mad attempt to interpret it as though it had
 an underlying truth.
  -- Umberto Eco


Re: [Numpy-discussion] How to tell numpy to use gfortran as a compiler ?

2007-02-26 Thread David Cournapeau
Robert Kern wrote:
 David Cournapeau wrote:
   
 Sturla Molden wrote:
 
 g77 is a Fortran 77 compiler. The development of g77 is halted.

 gfortran is a Fortran 77, 90, and 95 compiler. It is the current Fortran
 compiler in the GNU Compiler Collection (GCC).


 You can compile the reference implementation of BLAS and LAPACK with both
 g77 and gfortran, as these libraries are written in Fortran 77. ATLAS is
 written in C and some Fortran 77.

 gfortran is able to do some optimizations that g77 cannot, e.g.
 autovectorization using SSE and MMX extensions and profile-guided
 optimizations. Also be aware that if you use gfortran and GCC 4, the C
 compiler is better as well.

   
 Ok, that clears things up, thank you. Now I have to understand why
 --fcompiler=gnu95 still calls gfortran
 

 You mean g77? Anyways, I think I know why you are having problems. Passing
 --fcompiler to the config command only affects the Fortran compiler that is
 used during the configuration phase (where we compile small C programs to
 determine what your platform supports, like isnan() and the like). It does
 not propagate to the rest of the build_ext phase where you want it. Use
 config_fc to set up your Fortran compiler for all of the phases:

   $ python setup.py config_fc --fcompiler=gnu95 build
   
Thanks, that indeed was the cause of my problem.

cheers,

David


[Numpy-discussion] nan, warning and error modes

2007-02-26 Thread David Cournapeau
Hi,

    I am developing some numpy code which sometimes fails because of
NaN. This is likely to be due to some bad coding on my side, and as such
any NaN is a bug for this particular piece of code.
    Is there a way to get a warning (or even a failed assertion) when the
first NaN is detected in the code?  It looks like there are some
variables/functions related to that in numpy, but I didn't find any useful
documentation on the matter, either in the numpy book or on the scipy
website.

cheers,

David


Re: [Numpy-discussion] nan, warning and error modes

2007-02-26 Thread Robert Kern
David Cournapeau wrote:
 Hi,
 
 I am developing some numpy code which sometimes fails because of
 NaN. This is likely to be due to some bad coding on my side, and as such
 any NaN is a bug for this particular piece of code.
 Is there a way to get a warning (or even a failed assertion) when the
 first NaN is detected in the code?  It looks like there are some
 variables/functions related to that in numpy, but I didn't find any useful
 documentation on the matter, either in the numpy book or on the scipy
 website.

seterr(invalid='warning')

Described in section 9.1.4 Error Handling of the _Guide to NumPy_ (dated
2006-12-07).

-- 
Robert Kern

I have come to believe that the whole world is an enigma, a harmless enigma
 that is made terrible by our own mad attempt to interpret it as though it had
 an underlying truth.
  -- Umberto Eco


Re: [Numpy-discussion] nan, warning and error modes

2007-02-26 Thread Robert Kern
On 2/27/07, Robert Kern [EMAIL PROTECTED] wrote:
 David Cournapeau wrote:
  Hi,
 
  I am developing some numpy code which sometimes fails because of
  NaN. This is likely to be due to some bad coding on my side, and as such
  any NaN is a bug for this particular piece of code.
  Is there a way to get a warning (or even a failed assertion) when the
  first NaN is detected in the code?  It looks like there are some
  variables/functions related to that in numpy, but I didn't find any useful
  documentation on the matter, either in the numpy book or on the scipy
  website.

 seterr(invalid='warning')

Ahem.

  seterr(invalid='warn')
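
For example, a quick sketch of the two modes (note that seterr only applies
to floating-point errors raised by numpy operations, e.g. 0.0/0.0 -- it will
not flag NaNs already sitting in your input data):

import numpy

numpy.seterr(invalid='warn')      # warn on invalid (NaN-producing) operations
numpy.array([0.0]) / numpy.array([0.0])   # emits a warning

numpy.seterr(invalid='raise')     # or turn it into an exception instead
try:
    numpy.array([0.0]) / numpy.array([0.0])
except FloatingPointError:
    print("caught a FloatingPointError")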

-- 
Robert Kern

I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth.
  -- Umberto Eco