Re: [Numpy-discussion] [Announce] Numpy 1.3.0 rc1

2009-03-30 Thread David Cournapeau
David Cournapeau wrote:
> On Mon, Mar 30, 2009 at 3:36 AM, Robert Pyle  wrote:
>
>   
>> I just installed 2.5.4 from python.org, and the OS X installer still
>> doesn't work.  This is on a PPC G5; I haven't tried it on my Intel
>> MacBook Pro.
>> 

Could you try this one?

http://www.ar.media.kyoto-u.ac.jp/members/david/archives/numpy/numpy-1.3.0rc1-py2.5-macosx10.5.mpkg.tbz2

If it does not work, getting the /var/tmp/install.log would be helpful
(the last few lines),

cheers,

David
___
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] SWIG and numpy.i

2009-03-30 Thread Kevin Françoisse
Hello Bill,

Finally, I just changed my function header to take a double* rather than a
double**.  It's working fine now.  Thank you for all your answers and your
help!

SWIG and numpy.i are really cool once you know how to use them!

I also use INPLACE arrays as a way to output 2D arrays from my C functions.

Kevin
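For anyone hitting the same double** mismatch: numpy.i's IN_ARRAY2 typemap hands C the array's flat, C-contiguous buffer, which is why a double* wrapper works -- mat[i][j] in the double** version becomes mat[i*m + j] in the flat one. A quick NumPy sketch of that equivalence (illustrative only, not from the thread):

```python
import numpy as np

# A 2-D array and its flat, row-major (C-order) buffer -- what a
# double* wrapper around a matSum()-style function actually receives.
a = np.arange(12, dtype=np.float64).reshape(3, 4)
flat = a.ravel()

n, m = a.shape
# mat[i][j] in the double** version corresponds to flat[i*m + j]
total = sum(flat[i * m + j] for i in range(n) for j in range(m))

assert total == a.sum() == 66.0
```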

On Wed, Mar 25, 2009 at 3:03 PM, Bill Spotz  wrote:

> Kevin,
>
> In this instance, the best thing is to write a wrapper function that calls
> your matSum() function,  and takes a double* rather than a double**.  You
> can %ignore the original function and %rename the wrapper so that the python
> interface gets the name you want.
>
>
> On Mar 25, 2009, at 7:39 AM, Kevin Françoisse wrote:
>
>  Thanks Bill, it helped me a lot! My function works fine now.
>>
>> But I encountered another problem, this time with a NumPy array of 2
>> dimensions.
>>
>> Here is the function I want to use:
>>
>> //
>> double matSum(double** mat, int n, int m){
>>    int i,j;
>>    double sum = 0.0;
>>    for (i=0;i<n;i++){
>>        for (j=0;j<m;j++){
>>            sum += mat[i][j];
>>        }
>>    }
>>    return sum;
>> }
>> //
>>
>> I supposed that the typemap to use is the following:
>>
>> %apply (double* IN_ARRAY2, int DIM1, int DIM2) {(double** mat, int n, int
>> m)};
>>
>> But it is not working. Of course, my typemap assignment is not
>> compatible with my function parameters. I tried several ways of using a
>> two-dimensional array, but I'm not sure what the best way to do it is.
>>
>> Thanks
>>
>> ---
>> Kevin Françoisse
>> Ph.D. at Machine Learning Group at UCL
>> Belgium
>> kevin.francoi...@uclouvain.be
>>
>> On Tue, Mar 24, 2009 at 6:13 PM, Bill Spotz  wrote:
>> Kevin,
>>
>> You need to declare vecSum() *after* you %include "numpy.i" and use the
>> %apply directive.  Based on what you have, I think you can just get rid of
>> the "extern double vecSum(...)".  I don't see what purpose it serves.  As
>> is, it is telling swig to wrap vecSum() before you have set up your numpy
>> typemaps.
>>
>>
>> On Mar 24, 2009, at 10:33 AM, Kevin Françoisse wrote:
>>
>> Hi everyone,
>>
>> I have been using NumPy for a couple of months now, as part of my research
>> project at the university.  But now I have to use a big C library I wrote
>> myself in a Python project, so I chose to use SWIG for the interface
>> between my Python script and my C library. To make things more
>> comprehensible, I wrote a small C function that illustrates my problem:
>>
>> /* matrix.c */
>>
>> #include 
>> #include 
>> /* Compute the sum of a vector of reals */
>> double vecSum(int* vec,int m){
>>  int  i;
>>  double sum =0.0;
>>
>>  for(i=0;i<m;i++){
>>      sum += vec[i];
>>  }
>>  return sum;
>> }
>>
>> /***/
>>
>> /* matrix.h */
>>
>> double vecSum(int* vec,int m);
>>
>> /***/
>>
>> /* matrix.i */
>>
>> %module matrix
>> %{
>> #define SWIG_FILE_WITH_INIT
>> #include "matrix.h"
>> %}
>>
>> extern double vecSum(int* vec, int m);
>>
>> %include "numpy.i"
>>
>> %init %{
>> import_array();
>> %}
>>
>> %apply (int* IN_ARRAY1, int DIM1) {(int* vec, int m)};
>> %include "matrix.h"
>>
>> /***/
>>
>> I'm using a python script to compile my swig interface and my C files
>> (running Mac OS X 10.5)
>>
>> /* matrixSetup.py */
>>
>> from distutils.core import setup, Extension
>> import numpy
>>
>> setup(name='matrix', version='1.0', ext_modules =[Extension('_matrix',
>> ['matrix.c','matrix.i'],
>>   include_dirs = [numpy.get_include(),'.'])])
>>
>> /***/
>>
>> Everything seems to work fine ! But when I test my wrapped module in
>> python with an small NumPy array, here what I get :
>>
>> >>> import matrix
>> >>> from numpy import *
>> >>> a = arange(10)
>> >>> matrix.vecSum(a,a.shape[0])
>> Traceback (most recent call last):
>>  File "", line 1, in 
>> TypeError: in method 'vecSum', argument 1 of type 'int *'
>>
>> How can I tell SWIG that my Integer NumPy array should represent a int*
>> array in C ?
>>
>> Thank you very much,
>>
>> Kevin
>> 
>>
>> ** Bill Spotz  **
>> ** Sandia National Laboratories  Voice: (505)845-0170  **
>> ** P.O. Box 5800 Fax:   (505)284-0154  **
>> ** Albuquerque, NM 87185-0370Email: wfsp...@sandia.gov **
>>
>>
>>
>>
>>
>>
>>
>>
> ** Bill Spotz  **
> ** Sandia National Laboratories  Voice: (505)845-0170  **
> ** P.O. Box 5800 Fax:   (505)284-0154  **
> ** Albuquerque, NM 87185-0370Email: wfsp...@sandia.gov **
>
>
>
>
>
>
>
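For completeness, the reordering Bill describes earlier in the thread amounts to a matrix.i like the following sketch (the premature extern declaration dropped, and the %apply directive issued before matrix.h is wrapped):

```swig
/* matrix.i -- sketch of the corrected ordering */
%module matrix
%{
#define SWIG_FILE_WITH_INIT
#include "matrix.h"
%}

%include "numpy.i"

%init %{
import_array();
%}

/* Set up the typemap first, then wrap the header so vecSum() sees it. */
%apply (int* IN_ARRAY1, int DIM1) {(int* vec, int m)};
%include "matrix.h"
```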


Re: [Numpy-discussion] [Announce] Numpy 1.3.0 rc1

2009-03-30 Thread David Cournapeau
Peter wrote:
> On Sat, Mar 28, 2009 at 2:26 PM, David Cournapeau
>  wrote:
>   
>> Hi,
>>
>> I am pleased to announce the release of the rc1 for numpy
>> 1.3.0. You can find source tarballs and installers for both Mac OS X
>> and Windows on the sourceforge page:
>>
>> https://sourceforge.net/projects/numpy/
>> 
>
> For the beta release, I can see both
> numpy-1.3.0b1-win32-superpack-python2.5.exe and
> numpy-1.3.0b1-win32-superpack-python2.6.exe
>
> However, for the first release candidate I can only see
> numpy-1.3.0rc1-win32-superpack-python2.5.exe - no Python 2.6 version.

I uploaded it but forgot to update it on the sourceforge download page.
This should be fixed,

David


Re: [Numpy-discussion] Failure with 1.3.0b1 under Solaris 10 SPARC

2009-03-30 Thread Jeff Blaine
> What version of glibc do you have?

None.  Solaris does not use GNU libc.



Re: [Numpy-discussion] [Announce] Numpy 1.3.0 rc1

2009-03-30 Thread Robert Cimrman
Hi,

It might be too late (I was off-line last week), but anyway:
I have set the milestone for ticket 1036 [1] to 1.4, but it does not 
change the existing functionality, only brings some new functionality, and 
the tests pass, so I wonder if it could get into the 1.3 release?

cheers,
r.

[1] http://projects.scipy.org/numpy/ticket/1036

David Cournapeau wrote:
> Hi,
> 
> I am pleased to announce the release of the rc1 for numpy
> 1.3.0. You can find source tarballs and installers for both Mac OS X
> and Windows on the sourceforge page:
> 
> https://sourceforge.net/projects/numpy/
> 
> 
> The release note for the 1.3.0 release are below,
> 
> The Numpy developers


[Numpy-discussion] np.savez not multi-processing safe, alternatives?

2009-03-30 Thread Wes McKinney
I have a process that outputs a number of sets of 3 arrays, which can
either be stored as a few .npy files or as an .npz file with the same keys in
each file (let's say, writing roughly 10,000 npz files, all containing the
same keys 'a', 'b', 'c'). If I run multiple processes on the same machine
(desirable, since they are heavily database-IO-bound), over a period of hours
some of the npz writes will collide and fail due to the use of tempfile and
tempfile.gettempdir() (either one of the .npy subfiles will be locked for
writing, or it will get os.remove'd while the zip file is being written).

So my question -- any recommendations for a way around this, or would it be
possible to change the savez function to make this less likely? (I am on Win32.)

Thanks,
Wes
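One workaround (a sketch, not an official numpy fix; savez_atomic is a name invented here): give each process its own temporary file in the destination directory and rename it into place, so writers never share a path. This sidesteps collisions on the final file; the internal use of tempfile.gettempdir() inside 2009-era np.savez is a separate issue that later numpy versions removed by writing zip members directly.

```python
import os
import tempfile
import numpy as np

def savez_atomic(path, **arrays):
    """Write an .npz under a unique name, then rename it into place.

    tempfile.mkstemp guarantees each caller a distinct file, so
    concurrent processes cannot collide on a shared temporary path.
    """
    dirname = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(suffix='.npz', dir=dirname)
    os.close(fd)
    try:
        np.savez(tmp, **arrays)   # tmp already ends in .npz, so no renaming by savez
        os.replace(tmp, path)     # atomic on POSIX; replaces existing target
    except BaseException:
        if os.path.exists(tmp):
            os.remove(tmp)
        raise
```

On Python 2 / Win32 (as in the original report) os.replace does not exist and os.rename fails when the target exists, so there one would remove the target first or retry.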


Re: [Numpy-discussion] [Announce] Numpy 1.3.0 rc1

2009-03-30 Thread Peter
On Sat, Mar 28, 2009 at 2:26 PM, David Cournapeau
 wrote:
>
> Hi,
>
> I am pleased to announce the release of the rc1 for numpy
> 1.3.0. You can find source tarballs and installers for both Mac OS X
> and Windows on the sourceforge page:
>
> https://sourceforge.net/projects/numpy/

For the beta release, I can see both
numpy-1.3.0b1-win32-superpack-python2.5.exe and
numpy-1.3.0b1-win32-superpack-python2.6.exe

However, for the first release candidate I can only see
numpy-1.3.0rc1-win32-superpack-python2.5.exe - no Python 2.6 version.
Is this an oversight, or maybe some caching issue with the sourceforge
mirror system?  In the meantime I'll give the beta a go on Python 2.6
on my Windows XP machine...

Thanks,

Peter


Re: [Numpy-discussion] Failure with 1.3.0b1 under Solaris 10 SPARC

2009-03-30 Thread Jeff Blaine
FWIW, I solved this just now by removing Sun Studio from
my PATH before the build.  That's clearly a workaround,
though; the build process failed to determine something
properly.

Jeff Blaine wrote:
>> What version of glibc do you have?
> 
> None.  Solaris does not use GNU libc.
> 




[Numpy-discussion] Numpy 1.3.0 rc1 OS X Installer

2009-03-30 Thread Robert Pyle
Hi David,

I decided to change the Subject line to be more apropos.

On Mar 30, 2009, at 3:41 AM, David Cournapeau wrote:

> David Cournapeau wrote:
>> On Mon, Mar 30, 2009 at 3:36 AM, Robert Pyle  
>>  wrote:
>>
>>
>>> I just installed 2.5.4 from python.org, and the OS X installer still
>>> doesn't work.  This is on a PPC G5; I haven't tried it on my Intel
>>> MacBook Pro.
>>>
>
> Could you try this one ?
>
> http://www.ar.media.kyoto-u.ac.jp/members/david/archives/numpy/numpy-1.3.0rc1-py2.5-macosx10.5.mpkg.tbz2

This one installs, but only in /Library/Python/2.5/site-packages/,  
that is, for Apple's system python.  This happened when `which python`  
pointed to either EPD python or python.org's 2.5.4.

> If it does not work, getting the /var/tmp/install.log would be helpful
> (the few last lines),

> /var/tmp/ had a bunch of stuff in it, but no file named  
> install.log.  Perhaps that's because the installation succeeded?

Bob





Re: [Numpy-discussion] Numpy 1.3.0 rc1 OS X Installer

2009-03-30 Thread David Cournapeau
On Mon, Mar 30, 2009 at 11:06 PM, Robert Pyle  wrote:
> Hi David,
>
> I decided to change the Subject line to be more apropos.
>
> On Mar 30, 2009, at 3:41 AM, David Cournapeau wrote:
>
>> David Cournapeau wrote:
>>> On Mon, Mar 30, 2009 at 3:36 AM, Robert Pyle
>>>  wrote:
>>>
>>>
 I just installed 2.5.4 from python.org, and the OS X installer still
 doesn't work.  This is on a PPC G5; I haven't tried it on my Intel
 MacBook Pro.

>>
>> Could you try this one ?
>>
>> http://www.ar.media.kyoto-u.ac.jp/members/david/archives/numpy/numpy-1.3.0rc1-py2.5-macosx10.5.mpkg.tbz2
>
> This one installs, but only in /Library/Python/2.5/site-packages/,
> that is, for Apple's system python.  This happened when `which python`
> pointed to either EPD python or python.org's 2.5.4.

Yes, your default python does not matter: I don't know the
details, but it looks like the Mac OS X installer only checks whether a
python binary exists in /System/Library/..., which is the one I used to
build the package. You can see this in the Info.plist inside the
.mpkg.


>> If it does not work, getting the /var/tmp/install.log would be helpful
>> (the few last lines),
>
>> /var/tmp/ had a bunch of stuff in it, but no file named
>> install.log.  Perhaps that's because the installation succeeded?

It is because I mistyped the path... logs are in /var/log/install.log.
I tried to find a way to at least print something about missing
requirements, but it does not look like there is a lot of documentation
out there on the Apple installer.

cheers,

David


Re: [Numpy-discussion] Numpy 1.3.0 rc1 OS X Installer

2009-03-30 Thread Chris Barker
David Cournapeau wrote:
> On Mon, Mar 30, 2009 at 11:06 PM, Robert Pyle  wrote:
>> This one installs, but only in /Library/Python/2.5/site-packages/,
>> that is, for Apple's system python.  This happened when `which python`
>> pointed to either EPD python or python.org's 2.5.4.
> 
> Yes, what your default python is does not matter: I don't know the
> details, but it looks like the mac os x installer only looks whether a
> python binary exists in /System/Library/..., that is the one I used to
> build the package. You can see this in the Info.plist inside the
> .mpkg.


Well, this is the big question: which python(s) should we provide 
binaries for -- I think if you're only going to do one, it should be the 
python.org build, so that you can support both 10.4 and 10.5 and everyone 
can use it.

There are ways to build an installer that puts it in a place that both 
can find it -- wxPython does this -- but I'm not so sure that's a good idea.

One of the key questions is how one should think of Apple's Python. They 
are using it for some system tools, so we really shouldn't break it. If 
you upgrade the numpy it comes with, there is some chance that you could 
break something.

Also, Apple has not upgraded their Python (and likely won't). I 
happened to run into a bug and needed a newer 2.5, so I'd rather have 
the control.

A few years ago the MacPython community (as represented by the members 
of the pythonmac list) decided that the python.org build was the one 
that we should all target for binaries. That consensus has weakened with 
10.5, as Apple did provide a Python that is fairly up to date and almost 
fully functional, but I think it's still a lot easier on everyone if we 
just stick with the python.org build as the one to target for binaries.

That being said, it shouldn't be hard to build separate binaries for 
each python -- they would be identical except for where they get 
installed, and if they are clearly marked for downloading, there 
shouldn't be too much confusion.

-Chris




Re: [Numpy-discussion] Numpy 1.3.0 rc1 OS X Installer

2009-03-30 Thread David Cournapeau
On Tue, Mar 31, 2009 at 12:51 AM, Chris Barker  wrote:

> Well, this is the big question: what python(s) should be provide
> binaries for -- I think if you're only going to do one, it should be the
> python.org build, so that you can support 10.4, and 10.5 and everyone
> can use it.

I don't really care, as long as there is only one. Maintaining binaries
for every python out there is too time consuming. Given that Mac OS X
is the easiest platform to build numpy/scipy on, that's not something
I am interested in.

> There are ways to build an installer that puts it in a place that both
> can find it -- wxPython does this -- but I'm not so sure that's a good idea.

There is the problem of compatibility. I am not sure whether Apple's
python and python.org's are ABI compatible - even if the version is the
same, you can certainly build incompatible pythons (I don't know if
that's the case on Mac OS).


> Also, Apple has not (and likely won't) upgrade their Python. I know I
> happened to run into a bug and needed a newer 2.5, so I'd rather have
> the control.

That's a rather convincing argument. I will thus build binaries
against python.org binaries (I still have to find a way to guarantee
this in the build script, but that should not be too difficult).

> That being said, it shouldn't be hard to build separate binaries for
> each python -- they would be identical except for where they get
> installed, and if they are clearly marked for downloading, there
> shouldn't be too much confusion.

My experience is that every choice presented to the user makes for
more problems. And that just takes too much time. I prefer spending
time making a few good installers rather than many half-baked ones.
Ideally, we should have something which could install on every python
version, but oh well,

David


[Numpy-discussion] JOB: write numpy docs

2009-03-30 Thread Joe Harrington
Last year's Doc Marathon got us off to a great start on documenting
NumPy!  But, there's still much work to be done, and SciPy after that.
It's time to gear up for doing it again.

Critical to last year's success was Stefan van der Walt's committed
time, but he will be unable to play that role this year.  So, I am
looking to hire someone to write NumPy docs and help coordinate the
doc project and its volunteers.  The job includes working with me, the
doc team, doc volunteers, and developers to:

write and review a lot of docs, mainly those that others don't want to write
help define milestones
organize campaigns and volunteer teams to meet them
research the NumPy and SciPy source codes to help plan:
  the eventual SciPy documentation
  the writing of a good User Manual
work with the packaging team to meet their release deadlines
perform other duties as assigned

I am seeking someone to work full time if possible, and at least half
time, from mid-April (or soon thereafter) through at least the
(northern) summer.  Candidates must be experienced NumPy and SciPy
programmers; familiarity under the hood is a strong plus.  They must
also demonstrate their ability to produce excellent docs on the
docs.SciPy.org wiki.  Having contributed at a high level to an
open-source community, especially to SciPy, is a big plus.  Ability to
take direction, work with and lead a team, and to work for extended
periods without direct supervision on a list of assigned tasks are all
critical.  The applicant must be able to function well in a Linux
environment; familiarity with multiple platforms is a plus.

Please reply directly to me by email only.  Include the following (PDF
or ASCII formats strongly preferred):

CV

Statement of interest, qualifications per requirements above,
availability, and wage expectations.

Contact info for at least 3 professional references.

Links to doc wiki pages for which you wrote the initial draft
Links to doc wiki pages started by others to which you contributed
  significantly (edited, reviewed, proofed)

The position is open until filled; candidates with complete
applications by April 15 will receive full consideration.  This is an
open posting.  Candidates who have not written any pages on the doc
wiki yet have several weeks in which to do so.

Pay will be commensurate with experience (up to a point).  Relocation
is not necessary.  Candidates will need to provide their own computer
and internet access.  The University of Central Florida is an equal
opportunity, equal access, affirmative action employer.

--jh--
Prof. Joseph Harrington
Department of Physics
MAP 414
4000 Central Florida Blvd.
University of Central Florida
Orlando, FL 32816-2385
(407) 823-3416 voice
(407) 823-5112 fax
(407) 823-2325 physics office
j...@physics.ucf.edu


[Numpy-discussion] Numpy 1.3.0 rc1 fails find_duplicates on Solaris

2009-03-30 Thread Mark Sienkiewicz
Numpy 1.3.0 rc1 fails this self-test on Solaris.


==
FAIL: Test find_duplicates
--
Traceback (most recent call last):
  File "/usr/ra/pyssg/2.5.1/numpy/lib/tests/test_recfunctions.py", line 
163, in test_find_duplicates
assert_equal(test[0], a[control])
  File "/usr/stsci/pyssgdev/2.5.1/numpy/ma/testutils.py", line 121, in 
assert_equal
return assert_array_equal(actual, desired, err_msg)
  File "/usr/stsci/pyssgdev/2.5.1/numpy/ma/testutils.py", line 193, in 
assert_array_equal
header='Arrays are not equal')
  File "/usr/stsci/pyssgdev/2.5.1/numpy/ma/testutils.py", line 186, in 
assert_array_compare
verbose=verbose, header=header)
  File "/usr/stsci/pyssgdev/2.5.1/numpy/testing/utils.py", line 395, in 
assert_array_compare
raise AssertionError(msg)
AssertionError:
Arrays are not equal

(mismatch 50.0%)
 x: array([(1, (2.0, 'B')), (2, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0, 
'B'))],
  dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])])
 y: array([(2, (2.0, 'B')), (1, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0, 
'B'))],
  dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])])

--



The software I am using:

NumPy version 1.3.0rc1
Python version 2.5.1 (r251:54863, Jun  4 2008, 15:48:19) [C]
nose version 0.10.4


I think this identifies the compilers it was built with:

customize SunFCompiler
Found executable /opt/SUNWspro-6u2/bin/f90
Could not locate executable echo ranlib
customize SunFCompiler
customize SunFCompiler using config
C compiler: cc -DNDEBUG -O -xcode=pic32



It passes in Python 2.5.1 on these machines:
x86 macintosh, 32 bit
Red Hat Enterprise 4 Linux, x86, 32 bit
RHE 3, x86, 32 bit
RHE 4, x86, 64 bit
PowerPC mac, 32 bit  (Yes, even the PPC mac.)


I see that this is the same problem as 
http://projects.scipy.org/numpy/ticket/1039 but the data used in the 
test is different.

Mark S.



Re: [Numpy-discussion] Numpy 1.3.0 rc1 OS X Installer

2009-03-30 Thread Chris Barker
David Cournapeau wrote:
> I don't really care, as long as there is only one. Maintaining binaries
> for every python out there is too time consuming. Given that mac os X
> is the easiest platform to build numpy/scipy on,

I assume you meant NOT the easiest? ;-)

> that's not something i am interested in.

quite understandable.

>> There are ways to build an installer that puts it in a place that both
>> can find it -- wxPython does this -- but I'm not so sure that's a good idea.
> 
> there is the problem of compatibility. I am not sure whether Apple
> python and python.org are ABI compatible

In theory, yes, and in practice, it seems to be working for wxPython. 
However, I agree that it's a bit risky. I'm at the PyCon MacPython 
sprint as we type -- and apparently Apple's is linked with the 10.5 sdk, 
whereas python.org's is linked against the 10.3 sdk -- so there could be 
  issues.

> I will thus build binaries
> against python.org binaries (I still have to find a way to guarantee
> this in the build script, but that should not be too difficult).

Hardcoding the path to python should work:

PYTHON=/Library/Frameworks/Python.framework/Versions/2.5/bin/python


> My experience is that every choice presented to the user makes for
> more problem. And that just takes too much time. I prefer spending
> time making a few good installers rather than many half baked.

I agree -- and most packages I use seem to supporting python.org 
exclusively for binaries.

> Ideally, we should have something which could install on every python
> version, but oh well,

well, I guess that's the promise of easy_install -- but someone would 
have to build all the binary eggs... and there were weird issues with 
universal eggs on the mac that I understand have been fixed in 2.6, but 
not 2.5

Thanks for all your work on this,

-Chris



Re: [Numpy-discussion] Numpy 1.3.0 rc1 OS X Installer

2009-03-30 Thread David Cournapeau
On Tue, Mar 31, 2009 at 2:10 AM, Chris Barker  wrote:
> David Cournapeau wrote:
>> I don't really care, as long as there is only one. Maintaining binaries
>> for every python out there is too time consuming. Given that mac os X
>> is the easiest platform to build numpy/scipy on,
>
> I assume you meant NOT the easiest? ;-)

Actually, no, I meant it :) It has gcc, the compiler best supported
by numpy and scipy, there is almost no problem with g77, and
the optimized blas/lapack is provided by the OS vendor, meaning no ABI
issues, weird atlas build errors, etc... It is almost impossible to get
the build wrong on Mac OS X once you have the right fortran compiler.


>
> In theory, yes, and in practice, it seems to be working for wxPython.
> However, I agree that it's a bit risky. I'm at the PyCon MacPython
> sprint as we type -- and apparently Apple's is linked with the 10.5 sdk,
> whereas python.org's is linked against the 10.3 sdk -- so there could be
>  issues.

I am almost certain there are issues in some configurations, in
particular x86_64. I don't know the details, but I have seen this kind
of problem mentioned several times:

http://osdir.com/ml/python-dev/2009-02/msg00339.html

I can see how this could cause trouble.

>
>> I will thus build binaries
>> against python.org binaries (I still have to find a way to guarantee
>> this in the build script, but that should not be too difficult).
>
> Hardcoding the path to python should work:
>
> PYTHON=/Library/Frameworks/Python.framework/Versions/2.5/bin/python

Well, yes, but you can't really control this in the bdist_mpkg
command. Also, my current paver file uses virtualenv to build an
isolated numpy - that's what breaks the .mpkg, but I like this
approach for building, so I would like to keep it as much as possible.

> well, I guess that's the promise of easy_install -- but someone would
> have to build all the binary eggs... and there were weird issues with
> universal eggs on the mac that I understand have been fixed in 2.6, but
> not 2.5

There are numerous problems with eggs (or more precisely, with "easy"
install), which I am just not interested in getting into. In
particular, it often breaks the user's system - fixing it is easy for
developers/"power users", but is a PITA for normal users. As long as
easy_install is broken, I don't want to use it.

cheers,

David
___
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


[Numpy-discussion] Optical autocorrelation calculated with numpy is slow

2009-03-30 Thread João Luís Silva
Hi,

I wrote a script to calculate the *optical* autocorrelation of an 
electric field. It's like the autocorrelation, but sums the fields 
instead of multiplying them. I'm calculating

I(tau) = integral( abs(E(t)+E(t-tau))**2,t=-inf..inf)

with the script appended at the end. It's too slow for my purposes (takes ~5 
seconds, and scales as ~O(N**2)). numpy's correlate is fast enough, but 
isn't what I need, as it multiplies the fields instead of adding them. Could 
you help me get this script to run faster (without having to write it in 
another programming language)?

Thanks,
João Silva

#

import numpy as np
#import matplotlib.pyplot as plt

n = 2**12
n_autocorr = 3*n-2

c = 3E2
w0 = 2.0*np.pi*c/800.0
t_max = 100.0
t = np.linspace(-t_max/2.0,t_max/2.0,n)

E = np.exp(-(t/10.0)**2)*np.exp(1j*w0*t)#Electric field

dt = t[1]-t[0]
t_autocorr=np.linspace(-dt*n_autocorr/2.0,dt*n_autocorr/2.0,n_autocorr)
E1 = np.zeros(n_autocorr,dtype=E.dtype)
E2 = np.zeros(n_autocorr,dtype=E.dtype)
Ac = np.zeros(n_autocorr,dtype=np.float64)

E2[n-1:n-1+n] = E[:]

for i in range(2*n-2):
    E1[:] = 0.0
    E1[i:i+n] = E[:]

    Ac[i] = np.sum(np.abs(E1+E2)**2)

Ac *= dt

#plt.plot(t_autocorr,Ac)
#plt.show()

#



Re: [Numpy-discussion] Optical autocorrelation calculated with numpy is slow

2009-03-30 Thread Anne Archibald
2009/3/30 João Luís Silva :
> Hi,
>
> I wrote a script to calculate the *optical* autocorrelation of an
> electric field. It's like the autocorrelation, but sums the fields
> instead of multiplying them. I'm calculating
>
> I(tau) = integral( abs(E(t)+E(t-tau))**2,t=-inf..inf)

You may be in trouble if there's cancellation, but can't you just
rewrite this as E(t)**2+E(t-tau)**2+2*E(t)*E(t-tau)? Then you have two
O(n) integrals and one standard autocorrelation...

Anne

> with script appended at the end. It's too slow for my purposes (takes ~5
> seconds, and scales ~O(N**2)). numpy's correlate is fast enough, but
> isn't what I need as it multiplies instead of add the fields. Could you
> help me get this script to run faster (without having to write it in
> another programming language) ?
>
> Thanks,
> João Silva
>
> #
>
> import numpy as np
> #import matplotlib.pyplot as plt
>
> n = 2**12
> n_autocorr = 3*n-2
>
> c = 3E2
> w0 = 2.0*np.pi*c/800.0
> t_max = 100.0
> t = np.linspace(-t_max/2.0,t_max/2.0,n)
>
> E = np.exp(-(t/10.0)**2)*np.exp(1j*w0*t)    #Electric field
>
> dt = t[1]-t[0]
> t_autocorr=np.linspace(-dt*n_autocorr/2.0,dt*n_autocorr/2.0,n_autocorr)
> E1 = np.zeros(n_autocorr,dtype=E.dtype)
> E2 = np.zeros(n_autocorr,dtype=E.dtype)
> Ac = np.zeros(n_autocorr,dtype=np.float64)
>
> E2[n-1:n-1+n] = E[:]
>
> for i in range(2*n-2):
>     E1[:] = 0.0
>     E1[i:i+n] = E[:]
>
>     Ac[i] = np.sum(np.abs(E1+E2)**2)
>
> Ac *= dt
>
> #plt.plot(t_autocorr,Ac)
> #plt.show()
>
> #
>


Re: [Numpy-discussion] Optical autocorrelation calculated with numpy is slow

2009-03-30 Thread Charles R Harris
On Mon, Mar 30, 2009 at 12:23 PM, Anne Archibald
wrote:

> 2009/3/30 João Luís Silva :
> > Hi,
> >
> > I wrote a script to calculate the *optical* autocorrelation of an
> > electric field. It's like the autocorrelation, but sums the fields
> > instead of multiplying them. I'm calculating
> >
> > I(tau) = integral( abs(E(t)+E(t-tau))**2,t=-inf..inf)
>
> You may be in trouble if there's cancellation, but can't you just
> rewrite this as E(t)**2+E(t-tau)**2+2*E(t)*E(t-tau)? Then you have two
> O(n) integrals and one standard autocorrelation...
>

That should work. The first two integrals are actually the same, but need to
be E(t)*E(t).conj(). The cross term needs twice the real part of
E(t)*E(t-tau).conj(). Numpy correlate should really have the conjugate built
in, but it doesn't.

Chuck
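Concretely: since E1 and E2 in the posted script are full copies of E, sum|E1|^2 and sum|E2|^2 both equal sum|E|^2 at every shift, and the remaining cross term is a single correlation. A sketch of the vectorized version (assuming a modern NumPy, where np.correlate does conjugate its second argument, unlike the 2009 version discussed above):

```python
import numpy as np

n = 2**12
n_autocorr = 3*n - 2

c = 3E2
w0 = 2.0*np.pi*c/800.0
t_max = 100.0
t = np.linspace(-t_max/2.0, t_max/2.0, n)
dt = t[1] - t[0]
E = np.exp(-(t/10.0)**2)*np.exp(1j*w0*t)   # same field as the posted script

# sum|E1+E2|^2 = sum|E1|^2 + sum|E2|^2 + 2*Re(sum E1*conj(E2));
# the first two terms are sum|E|^2 for every shift i, and the cross
# sum at shift i is the autocorrelation of E at lag (n-1)-i.
S = np.sum(np.abs(E)**2)
cross = np.correlate(E, E, mode='full')    # length 2*n-1, second arg conjugated
Ac = np.zeros(n_autocorr)
Ac[:2*n-2] = 2.0*S + 2.0*cross[::-1][:2*n-2].real
Ac *= dt
```

This reproduces the posted loop's Ac while replacing the O(N**2) Python loop with one C-level correlate call; an FFT-based correlation would reduce the arithmetic further for very large N.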


Re: [Numpy-discussion] Numpy 1.3.0 rc1 OS X Installer

2009-03-30 Thread Chris Barker
David Cournapeau wrote:
> On Tue, Mar 31, 2009 at 2:10 AM, Chris Barker  wrote:
>> I assume you meant NOT the easiest? ;-)
> 
> Actually, no, I meant it :) It has gcc, which is the best supported
> compiler by numpy and scipy, there is almost no problem with g77, and
> the optimized blas/lapack is provided by the OS vendor, meaning on ABI
> issue, weird atlas build errors, etc... It is almost impossible to get
> the build wrong on mac os x once you get the right fortran compiler.

I see -- well, that's good news. I've found the Universal library 
requirements to be a pain sometimes, and they probably would be here if 
Apple weren't giving us lapack/blas.

> I am almost certain there are issues in some configurations, in
> particular x86_64.

Well, neither Apple's nor python.org's builds are 64-bit anyway at this 
point. There is talk of quad (i386, x86_64, ppc, ppc64) builds in the 
future, though.

>>> I will thus build binaries
>>> against python.org binaries (I still have to find a way to guarantee
>>> this in the build script, but that should not be too difficult).
>> Hardcoding the path to python should work:
>>
>> PYTHON=/Library/Frameworks/Python.framework/Versions/2.5/bin/python
> 
> Well, yes, but you can't really control this in the bdist_mpkg
> command.

bdist_mpkg should do "the right thing" if it's run with the right 
python. So you need to make sure you run:

/Library/Frameworks/Python.framework/Versions/2.5/bin/bdist_mpkg

Rather than whatever one happens to be found on your PATH.


> Also, my current paver file uses virtualenv to build a
> isolated numpy - that's what breaks the .mpkg, but I like this
> approach for building, so I would like to keep it as much as possible.

Well, maybe we need to hack bdist_mpkg to support this, we're pretty 
sure that it is possible.

I want to make sure I understand what you want:

Do you want to be able to build numpy in a virtualenv, and then build a 
mpkg that will install into the user's regular Framework?

Do you want to be able to build a mpkg that users can install into the 
virtualenv of their choice?

Both?


Of course, easy_install can do that, when it works!

> There are numerous problems with eggs (or more precisely, with "easy"
> install), which I am just not interested in getting into.

me neither --

> In
> particular, it often breaks the user system - fixing it is easy for
> developers/"power users", but is a PITA for normal users. As long as
> easy_install is broken, I don't want to use it.

We were just talking about some of that last night -- we really need an 
"easy_uninstall", for instance.

I'm going to poke into bdist_mpkg a bit right now.

By the way, for the libgfortran issue, while statically linking it may 
be the best option, it wouldn't be too hard to have the mpkg include and 
install /usr/local/lib/libgfortran.dylib (or whatever).

-Chris



Re: [Numpy-discussion] np.savez not multi-processing safe, alternatives?

2009-03-30 Thread Pauli Virtanen
Mon, 30 Mar 2009 09:03:56 -0400, Wes McKinney wrote:
> I have a process that stores a number of sets of 3 arrays output which
> can either be stored as a few .npy files or an .npz file with the same
> keys in each file (let's say, writing roughly 10,000 npz files, all
> containing the same keys 'a', 'b', 'c'). If I run multiple processes on
> the same machine (desirable, since they are heavily database-IO-bound), over
> a period of hours some of the npz-writes will collide and fail due to
> the use of tempfile and tempfile.gettempdir() (either one of the .npy
> subfiles will be locked for writing or will get os.remove'd while the
> zip file is being written).

This is bug #852, it's fixed in trunk. As a workaround for the present, 
you may want to grab the `savez` function from

http://projects.scipy.org/numpy/browser/trunk/numpy/lib/io.py#L243

and use a copy of it in your code temporarily. The function is fairly 
small.
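Until then, another workaround (a sketch, not the actual trunk fix; `savez_atomic` is a made-up name) is to avoid the shared temp directory altogether: write each archive to a uniquely named file in the destination directory, then atomically rename it into place:

```python
import os
import tempfile

import numpy as np

def savez_atomic(path, **arrays):
    """Collision-safe savez sketch: write to a unique temp file in the
    *destination* directory, then rename into place.  os.replace is
    atomic on POSIX filesystems, so concurrent writers cannot see a
    half-written or removed file at `path`."""
    dirname = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(suffix='.npz', dir=dirname)
    os.close(fd)
    try:
        np.savez(tmp, **arrays)   # keeps the .npz suffix as-is
        os.replace(tmp, path)
    except BaseException:
        if os.path.exists(tmp):
            os.remove(tmp)
        raise
```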

-- 
Pauli Virtanen



Re: [Numpy-discussion] Numpy 1.3.0 rc1 fails find_duplicates on Solaris

2009-03-30 Thread Charles R Harris
On Mon, Mar 30, 2009 at 10:28 AM, Mark Sienkiewicz wrote:

> Numpy 1.3.0 rc1 fails this self-test on Solaris.
>
>
> ==
> FAIL: Test find_duplicates
> --
> Traceback (most recent call last):
>  File "/usr/ra/pyssg/2.5.1/numpy/lib/tests/test_recfunctions.py", line
> 163, in test_find_duplicates
>assert_equal(test[0], a[control])
>  File "/usr/stsci/pyssgdev/2.5.1/numpy/ma/testutils.py", line 121, in
> assert_equal
>return assert_array_equal(actual, desired, err_msg)
>  File "/usr/stsci/pyssgdev/2.5.1/numpy/ma/testutils.py", line 193, in
> assert_array_equal
>header='Arrays are not equal')
>  File "/usr/stsci/pyssgdev/2.5.1/numpy/ma/testutils.py", line 186, in
> assert_array_compare
>verbose=verbose, header=header)
>  File "/usr/stsci/pyssgdev/2.5.1/numpy/testing/utils.py", line 395, in
> assert_array_compare
>raise AssertionError(msg)
> AssertionError:
> Arrays are not equal
>
> (mismatch 50.0%)
>  x: array([(1, (2.0, 'B')), (2, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0,
> 'B'))],
>  dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])])
>  y: array([(2, (2.0, 'B')), (1, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0,
> 'B'))],
>  dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])])
>
> --
>
>
>
> The software I am using:
>
> NumPy version 1.3.0rc1
> Python version 2.5.1 (r251:54863, Jun  4 2008, 15:48:19) [C]
> nose version 0.10.4
>

These are new (two months old) tests. Hmm, they are also marked as known
failures on win32. I wonder why they fail there and not on linux? I think
you should open a ticket for this.

Chuck


Re: [Numpy-discussion] Numpy 1.3.0 rc1 fails find_duplicates on Solaris

2009-03-30 Thread Mark Sienkiewicz

>> ==
>> FAIL: Test find_duplicates
>> --
>> ...
>> AssertionError:
>> Arrays are not equal
>>
>> (mismatch 50.0%)
>>  x: array([(1, (2.0, 'B')), (2, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0,
>> 'B'))],
>>  dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])])
>>  y: array([(2, (2.0, 'B')), (1, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0,
>> 'B'))],
>>  dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])])
>>
>> --
>>
>> 
>
> These are new (two months old) tests. Hmm, they are also marked as known
> failures on win32. I wonder why they fail there and not on linux? I think
> you should open a ticket for this.
>   

I'm not sure how old the test is, but I see that it has been failing 
since Feb 1.  (That is the earliest report I have online at the moment.)

The ticket is http://projects.scipy.org/numpy/ticket/1039 .  I added 
this specific failure mode to the ticket today.

It does not surprise me at all when the trunk is broken on solaris.  I'm 
mentioning it on the list because I see it is still broken in the 
release candidate.  I assume somebody would want to either fix the 
problem or remove the non-working feature from the release.

Mark S.



Re: [Numpy-discussion] Numpy 1.3.0 rc1 fails find_duplicates on Solaris

2009-03-30 Thread Pauli Virtanen
Mon, 30 Mar 2009 14:03:17 -0600, Charles R Harris wrote:

> On Mon, Mar 30, 2009 at 10:28 AM, Mark Sienkiewicz
> wrote:
> 
>> Numpy 1.3.0 rc1 fails this self-test on Solaris.
[clip]
>> ==
>> FAIL: Test find_duplicates
>> --
>> assert_equal(test[0], a[control])
>>
>>  x: array([(1, (2.0, 'B')), (2, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0,
>> 'B'))],
>>  dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])])
>>  y: array([(2, (2.0, 'B')), (1, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0,
>> 'B'))],
>>  dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])])
>
> These are new (two months old) tests. Hmm, they are also marked as known
> failures on win32. I wonder why they fail there and not on linux? I
> think you should open a ticket for this.

The data seems to be in a different order in the index array and in the 
data array returned by `find_duplicates`.

Is it intended that find_duplicates guarantees that the returned indices 
correspond to the returned values?

Another question: the 'recfunctions' is not imported anywhere in numpy?

(BTW, it might be good not to keep commented-out code such as those 
np.knownfail decorators in the repository, unless it's explained why it's 
commented out...)

-- 
Pauli Virtanen



Re: [Numpy-discussion] Numpy 1.3.0 rc1 fails find_duplicates on Solaris

2009-03-30 Thread Charles R Harris
On Mon, Mar 30, 2009 at 3:04 PM, Mark Sienkiewicz wrote:

>
> >> ==
> >> FAIL: Test find_duplicates
> >> --
> >> ...
> >> AssertionError:
> >> Arrays are not equal
> >>
> >> (mismatch 50.0%)
> >>  x: array([(1, (2.0, 'B')), (2, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0,
> >> 'B'))],
> >>  dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])])
> >>  y: array([(2, (2.0, 'B')), (1, (2.0, 'B')), (2, (2.0, 'B')), (1, (2.0,
> >> 'B'))],
> >>  dtype=[('A', '>i4'), ('B', [('BA', '>f8'), ('BB', '|S1')])])
> >>
> >> --
> >>
> >>
> >
> > These are new (two months old) tests. Hmm, they are also marked as known
> > failures on win32. I wonder why they fail there and not on linux? I think
> > you should open a ticket for this.
> >
>
> I'm not sure how old the test is, but I see that it has been failing
> since Feb 1.  (That is the earliest report I have online at the moment.)
>
> The ticket is http://projects.scipy.org/numpy/ticket/1039 .  I added
> this specific failure mode to the ticket today.
>
> It does not surprise me at all when the trunk is broken on solaris.  I'm
> mentioning it on the list because I see it is still broken in the
> release candidate.  I assume somebody would want to either fix the
> problem or remove the non-working feature from the release.
>

I'm guessing that it is the test that needs fixing. And maybe the windows
problem is related.

Chuck


Re: [Numpy-discussion] DVCS at PyCon

2009-03-30 Thread Bruce Southey
David Cournapeau wrote:
> Hi Travis,
>
> On Sat, Mar 28, 2009 at 11:54 PM, Travis E. Oliphant
>  wrote:
>   
>> FYI from PyCon
>>
>> Here at PyCon, it has been said that Python will be moving towards DVCS
>> and will be using bzr or mecurial, but explicitly *not* git.   It would
>> seem that *git* got the "lowest" score in the Developer survey that
>> Brett Cannon did.
>> 
>
> It is interesting how those tools are viewed so differently in
> different communities. I am too quite doubtful about the validity of
> those surveys :)
>
>   
>> The reasons seem to be:
>>
>>  * git doesn't have good Windows clients
>> 
>
> Depending on what is meant by good windows client (GUI, IDE
> integration), it is true, but then neither do bzr or hg have good
> clients, so I find this statement a bit strange. What is certainly
> true is that git developers care much less about windows than bzr (and
> hg ?). For example, I would guess git will never care much about case
> insensitive fs, etc... (I know bzr developers worked quite a bit on
> this).
>
>   
>>  * git is not written with Python
>> 
>
> I can somewhat understand why it matters to python, but does it matter to us ?
>
> There are definitely strong arguments against git - but I don't think
> being written in python is a strong one. The lack of a good window
> support is a good argument against changing from svn, but very
> unconvincing compared to other tools. Git has now so much more
> manpower compared to hg and bzr (many more project use it: the list of
> visible projects using git is becoming quite impressive) - from a 3rd
> party POV, I think git is much better set up than bzr and hg. Gnome
> choosing git could be significant (they made the decision a couple of
> days ago).
>
>   
>> I think the sample size was pretty small to be making decisions on
>> (especially when most opinions where "un-informed").
>> 
>
> Most people just choose the one they first use. Few people know
> several DVCS. Pauli and me started a page about arguments pro/cons git
> - it is still very much work in progress:
>
> http://projects.scipy.org/numpy/wiki/GitMigrationProposal
>
> Since few people are willing to try different systems, we also started
> a few workflows (compared to svn):
>
> http://projects.scipy.org/numpy/wiki/GitWorkflow
>
> FWIW, I have spent some time to look at converting svn repo to git,
> with proper conversion of branches, tags, and other things. I have
> converted my own scikits to git as a first trial (I have numpy
> converted as well, but I did not put it anywhere to avoid confusion).
> This part of the problem would be relatively simple to handle.
>
> cheers,
>
> David
It is now official that Python will switch to Mercurial (Hg):
http://thread.gmane.org/gmane.comp.python.devel/102706

Not that it directly concerns me, but this is rather surprising given:
http://www.python.org/dev/peps/pep-0374/

Hopefully more details will be provided in the near future.

Bruce


Re: [Numpy-discussion] Numpy 1.3.0 rc1 fails find_duplicates on Solaris

2009-03-30 Thread Pauli Virtanen
Mon, 30 Mar 2009 15:15:06 -0600, Charles R Harris wrote:
> I'm guessing that it is the test that needs fixing. And maybe the windows
> problem is related.

Probably they are both related to the unspecified sort order of
the duplicates. The test was missing some comparisons that ignore sort order.

I think the test is now fixed in trunk:

http://projects.scipy.org/numpy/changeset/6827
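A sketch of the kind of order-insensitive comparison involved (illustrative data, not the actual test code): sort both structured arrays on the key before comparing, so a platform-dependent ordering of equal keys no longer matters.

```python
import numpy as np

dt = [('A', '<i4'), ('B', '<f8')]
# Same multiset of records, in two platform-dependent orderings.
actual = np.array([(2, 2.0), (1, 2.0), (2, 2.0), (1, 2.0)], dtype=dt)
expected = np.array([(1, 2.0), (2, 2.0), (2, 2.0), (1, 2.0)], dtype=dt)

# Direct comparison is order-sensitive and fails...
assert not (actual == expected).all()
# ...but sorting on the key first makes the comparison order-insensitive.
assert (np.sort(actual, order='A') == np.sort(expected, order='A')).all()
```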

-- 
Pauli Virtanen



Re: [Numpy-discussion] DVCS at PyCon

2009-03-30 Thread Alan G Isaac
On 3/30/2009 5:16 PM Bruce Southey apparently wrote:
> It is now official that Python will switch to Mercurial (Hg):
> http://thread.gmane.org/gmane.comp.python.devel/102706
> 
> Not that it directly concerns me, but this is rather surprising given:
> http://www.python.org/dev/peps/pep-0374/


http://www.python.org/dev/peps/pep-0374/#chosen-dvcs

;-)
Alan Isaac



[Numpy-discussion] lost with slicing

2009-03-30 Thread Partridge, Matthew BGI SYD
 
I apologise if I'm asking an obvious question or one that has already
been addressed.

I've tried to understand the documentation in the numpy manual on
slicing, but I'm a bit lost.  I'm trying to do indexing using both
slices and index lists.  I have a problem when I do something like:

x[0, :, [0,1,2]]

Here are a couple of examples:

>>> a = numpy.arange(6).reshape(2,3)
>>> print a
[[0 1 2]
 [3 4 5]]
>>> print a[:, [0,1,2]]   # example 1 - this works as I expected
[[0 1 2]
 [3 4 5]]
>>> b = numpy.arange(6).reshape(1,2,3)
>>> print b
[[[0 1 2]
  [3 4 5]]]
>>> print b[0, :, [0,1,2]]  # example 2 - this seems to be the transpose
of what I was expecting
[[0 3]
 [1 4]
 [2 5]]
>>> print b[0, [[0],[1]], [[0,1,2]]] # example 3 - this is what I
expected
[[0 1 2]
 [3 4 5]]

Am I doing something wrong?  Why do we get different behaviour in
example 2 compared with example 1 or example 3?

(I'm using numpy 1.0.3.1 on python 2.4.1 for windows, but I've tried
some more recent versions of numpy as well.)

mattp

 
--
 
This message and any attachments are confidential, proprietary, and may be 
privileged. If this message was misdirected, Barclays Global Investors (BGI) 
does not waive any confidentiality or privilege. If you are not the intended 
recipient, please notify us immediately and destroy the message without 
disclosing its contents to anyone. Any distribution, use or copying of this 
e-mail or the information it contains by other than an intended recipient is 
unauthorized. The views and opinions expressed in this e-mail message are the 
author's own and may not reflect the views and opinions of BGI, unless the 
author is authorized by BGI to express such views or opinions on its behalf. 
All email sent to or from this address is subject to electronic storage and 
review by BGI. Although BGI operates anti-virus programs, it does not accept 
responsibility for any damage whatsoever caused by viruses being passed.


Re: [Numpy-discussion] lost with slicing

2009-03-30 Thread josef . pktd
On Mon, Mar 30, 2009 at 7:54 PM, Partridge, Matthew BGI SYD
 wrote:
>
> I apologise if I'm asking an obvious question or one that has already
> been addressed.
>
> I've tried to understand the documentation in the numpy manual on
> slicing, but I'm a bit lost.  I'm trying to do indexing using both
> slices and index lists.  I have a problem when I do something like:
>
> x[0, :, [0,1,2]]
>
> Here are a couple of examples:
>
> >>> a = numpy.arange(6).reshape(2,3)
> >>> print a
> [[0 1 2]
>  [3 4 5]]
> >>> print a[:, [0,1,2]]   # example 1 - this works as I expected
> [[0 1 2]
>  [3 4 5]]
> >>> b = numpy.arange(6).reshape(1,2,3)
> >>> print b
> [[[0 1 2]
>  [3 4 5]]]
> >>> print b[0, :, [0,1,2]]  # example 2 - this seems to be the transpose
> of what I was expecting
> [[0 3]
>  [1 4]
>  [2 5]]
> >>> print b[0, [[0],[1]], [[0,1,2]]] # example 3 - this is what I
> expected
> [[0 1 2]
>  [3 4 5]]
>
> Am I doing something wrong?  Why do we get different behaviour in
> example 2 compared with example 1 or example 3?
>
> (I'm using numpy 1.0.3.1 on python 2.4.1 for windows, but I've tried
> some more recent versions of numpy as well.)
>
> mattp
>

that's how it works, whether we like it or not.

 see thread with title "is it a bug?" starting  march 11

Josef


Re: [Numpy-discussion] lost with slicing

2009-03-30 Thread Partridge, Matthew BGI SYD
 
> > I apologise if I'm asking an obvious question or one that 
> has already 
> > been addressed.
> >
> > I've tried to understand the documentation in the numpy manual on 
> > slicing, but I'm a bit lost.  I'm trying to do indexing using both 
> > slices and index lists.  I have a problem when I do something like:
> >
> > x[0, :, [0,1,2]]
> >
> > Here are a couple of examples:
> >
> > >>> a = numpy.arange(6).reshape(2,3)
> > >>> print a
> > [[0 1 2]
> >  [3 4 5]]
> > >>> print a[:, [0,1,2]]   # example 1 - this works as I expected
> > [[0 1 2]
> >  [3 4 5]]
> > >>> b = numpy.arange(6).reshape(1,2,3)
> > >>> print b
> > [[[0 1 2]
> >  [3 4 5]]]
> > >>> print b[0, :, [0,1,2]]  # example 2 - this seems to be the
> > transpose of what I was expecting
> > [[0 3]
> >  [1 4]
> >  [2 5]]
> > >>> print b[0, [[0],[1]], [[0,1,2]]] # example 3 - this is what I
> > expected
> > [[0 1 2]
> >  [3 4 5]]
> >
> > Am I doing something wrong?  Why do we get different behaviour in 
> > example 2 compared with example 1 or example 3?
> >
> > (I'm using numpy 1.0.3.1 on python 2.4.1 for windows, but I've tried 
> > some more recent versions of numpy as well.)
> >
> > mattp
> >
> 
> that's how it works, whether we like it or not.
> 
>  see thread with title "is it a bug?" starting  march 11
> 
> Josef

Thanks Josef,

I've looked over the "is it a bug" thread, and realise that it is very relevant!
But I'm still lost.  Robert Kern wrote:

  "It's certainly weird, but it's working as designed. Fancy indexing via
  arrays is a separate subsystem from indexing via slices. Basically,
  fancy indexing decides the outermost shape of the result (e.g. the
  leftmost items in the shape tuple). If there are any sliced axes, they
  are *appended* to the end of that shape tuple."

I see that's the case in example 2, but not in example 1 (above).  Josef, I also
see your example doesn't fit this explanation:

  >>> x = np.arange(30).reshape(3,5,2)
  >>> idx = np.array([0,1]); e = x[:,[0,1],0]; e.shape
  (3, 2)
  >>> idx = np.array([0,1]); e = x[:,:2,0]; e.shape
  (3, 2)

Travis Oliphant wrote:

  Referencing my previous post on this topic.   In this case, it is 
  unambiguous to replace dimensions 1 and 2 with the result of 
  broadcasting idx and idx together.   Thus the (5,6) dimensions is 
  replaced by the (2,) result of indexing leaving the outer dimensions 
  in-tact,  thus (4,2,7) is the result.

I'm unclear on when something is regarded as "unambiguous"; I don't really get 
how the rules work.

I'm trying to build something where I can do (for "a" having a shape 
(n1,n2,n3,...)):

a[i1, i2, i3, ...]

where i1, i2, i3 can be
* a single index:   eg a[3]
* a slice:  eg a[:3]
* a list of keys:   eg a[[1,2,3]]
and the interpretation of this should yield:
* no corresponding dimension if a single index is used
* a dimension of length of the slice if a slice is used
* a dimension of length of the list if a list is used

I currently apply the following logic:
* look through the index coordinates that are being applied
* if there are multiple list-of-key indices, then reshape them so that they 
will broadcast to agree:
   a[[1,2,3], [4,5]] --> a[[[1],[2],[3]], [[4,5]]]
* note if there are any slices.  If so, I assume (as per Robert Kern's remark) 
that the dimensions corresponding to the slices are going to be appended to the 
end.  So I make sure that I transpose my result at the end to correct for this.

When I do all this, I get example 2 behaving like example 3, but example 1 then 
doesn't work.  I'm not trying to get the discussion list to do my work for me, 
but I'm pretty confused as to when dimensions get swapped and when they don't; 
when something is "ambiguous" and when it is "unambiguous".

Any help appreciated,
thanks,
matt

 
--
 
This message and any attachments are confidential, proprietary, and may be 
privileged. If this message was misdirected, Barclays Global Investors (BGI) 
does not waive any confidentiality or privilege. If you are not the intended 
recipient, please notify us immediately and destroy the message without 
disclosing its contents to anyone. Any distribution, use or copying of this 
e-mail or the information it contains by other than an intended recipient is 
unauthorized. The views and opinions expressed in this e-mail message are the 
author's own and may not reflect the views and opinions of BGI, unless the 
author is authorized by BGI to express such views or opinions on its behalf. 
All email sent to or from this address is subject to electronic storage and 
review by BGI. Although BGI operates anti-virus programs, it does not accept 
responsibility for any damage whatsoever caused by viruses being passed.


Re: [Numpy-discussion] lost with slicing

2009-03-30 Thread Robert Kern
On Mon, Mar 30, 2009 at 20:29, Partridge, Matthew BGI SYD
 wrote:
> Thanks Josef,
>
> I've looked over "is it a bug" thread, and realise that it is very relevant!
> But I'm still lost.  Robert Kern wrote:
>
>  "It's certainly weird, but it's working as designed. Fancy indexing via
>  arrays is a separate subsystem from indexing via slices. Basically,
>  fancy indexing decides the outermost shape of the result (e.g. the
>  leftmost items in the shape tuple). If there are any sliced axes, they
>  are *appended* to the end of that shape tuple."

I was wrong. Don't listen to me. Travis's explanation is what you need.

> I see that's the case in example 2, but not in example 1 (above).  Josef, I 
> also
> see your example doesn't fit this explanation:
>
>  >>> x = np.arange(30).reshape(3,5,2)
>  >>> idx = np.array([0,1]); e = x[:,[0,1],0]; e.shape
>  (3, 2)
>  >>> idx = np.array([0,1]); e = x[:,:2,0]; e.shape
>  (3, 2)
>
> Travis Oliphant wrote:
>
>  Referencing my previous post on this topic.   In this case, it is
>  unambiguous to replace dimensions 1 and 2 with the result of
>  broadcasting idx and idx together.   Thus the (5,6) dimensions is
>  replaced by the (2,) result of indexing leaving the outer dimensions
>  in-tact,  thus (4,2,7) is the result.
>
> I'm unclear on when something is regarded as "unambiguous"; I don't really 
> get how the rules work.

When a slice is all the way on the left or right, but not in the middle.
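Concretely, a sketch reproducing the examples from this thread:

```python
import numpy as np

a = np.arange(6).reshape(2, 3)
b = np.arange(6).reshape(1, 2, 3)

# Advanced indices all adjacent (slice only on the left): the broadcast
# index dimension replaces the indexed axis in place.
assert a[:, [0, 1, 2]].shape == (2, 3)

# Slice sandwiched between advanced indices (the scalar 0 counts as an
# advanced index here): the broadcast index dimension moves to the
# front and the sliced axis is appended, hence the "transposed" result.
assert b[0, :, [0, 1, 2]].shape == (3, 2)
assert np.array_equal(b[0, :, [0, 1, 2]], [[0, 3], [1, 4], [2, 5]])
```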

-- 
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth."
  -- Umberto Eco


Re: [Numpy-discussion] lost with slicing

2009-03-30 Thread Partridge, Matthew BGI SYD
 
Sorry group.  I found Travis Oliphant's earlier 12 March post (that
didn't show up in the same thread), and found the answer to my question.

matt 


 

Re: [Numpy-discussion] DVCS at PyCon

2009-03-30 Thread David Cournapeau
On Tue, Mar 31, 2009 at 6:16 AM, Bruce Southey  wrote:
> It is now official that Python will switch to Mercurial (Hg):
> http://thread.gmane.org/gmane.comp.python.devel/102706
>
> Not that it directly concerns me, but this is rather surprising given:
> http://www.python.org/dev/peps/pep-0374/

I don't think it is: as Guido said in his email, someone has to make
the decision, and endless discussion go nowhere, because you can
always make arguments for one or the other. Since some core developers
are strongly against git (Martin Loewis for example), and given that
hg is used by several core python developers already, I think it makes
sense.

cheers,

David


Re: [Numpy-discussion] A module for homogeneous transformation matrices, Euler angles and quaternions

2009-03-30 Thread cgohlke
Hello,

I have reimplemented many functions of the transformations.py module
in a C extension module. Speed improvements are 5-50 times.




-- Christoph
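As an illustration of the kind of operation being accelerated (a from-scratch sketch; `euler_matrix_zyx` is a made-up name, not the transformations.py API):

```python
import numpy as np

def euler_matrix_zyx(alpha, beta, gamma):
    """Build a 4x4 homogeneous rotation matrix from z-y-x intrinsic
    Euler angles (illustrative convention only)."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Rz = np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rx = np.array([[1, 0, 0], [0, cg, -sg], [0, sg, cg]])
    M = np.eye(4)                 # homogeneous: last row/column fixed
    M[:3, :3] = Rz @ Ry @ Rx      # intrinsic z-y-x composition
    return M
```

A C or Cython version of exactly this kind of small, call-heavy routine is where the reported 5-50x speedups come from.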

On Mar 4, 8:28 pm, Jonathan Taylor 
wrote:
> Looks cool but a lot of this should be done in an extension module to
> make it fast.  Perhaps starting this process off as a separate entity
> until stability is acheived.  I would be tempted to do some of this
> using cython.  I just wrote found that generating a rotation matrix
> from euler angles is about 10x faster when done properly with cython.
>
> J.
>
> On Wed, Mar 4, 2009 at 5:10 PM, Gareth Elston
>
>  wrote:
> > I found a nice module for these transforms at
> >http://www.lfd.uci.edu/~gohlke/code/transformations.py.html. I've
> > been using an older version for some time and thought it might make a
> > good addition to numpy/scipy. I made some simple mods to the older
> > version to add a couple of functions I needed and to allow it to be
> > used with Python 2.4.
>
> > The module is pure Python (2.5, with numpy 1.2 imported), includes
> > doctests, and is BSD licensed. Here's the first part of the module
> > docstring:
>
> > """Homogeneous Transformation Matrices and Quaternions.
>
> > A library for calculating 4x4 matrices for translating, rotating, mirroring,
> > scaling, shearing, projecting, orthogonalizing, and superimposing arrays of
> > homogenous coordinates as well as for converting between rotation matrices,
> > Euler angles, and quaternions.
> > """
>
> > I'd like to see this added to numpy/scipy so I know I've got some
> > reading to do (scipy.org/Developer_Zone and the huge scipy-dev
> > discussions on Scipy development infrastructure / workflow) to make
> > sure it follows the guidelines, but where would people like to see
> > this? In numpy? scipy? scikits? elsewhere?
>
> > I seem to remember that there was a first draft of a guide for
> > developers being written. Are there any links available?
>
> > Thanks,
> > Gareth.
___
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion
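For readers following the thread, here is a minimal NumPy sketch of the operation under discussion: building a homogeneous 4x4 rotation matrix from Euler angles. The function name and the z-y-x composition order are illustrative assumptions, not the transformations.py API:

```python
import numpy as np

def euler_matrix_sketch(ai, aj, ak):
    """Illustrative 4x4 homogeneous rotation from Euler angles (radians),
    composed as Rz(ai) * Ry(aj) * Rx(ak); not the transformations.py API."""
    ci, si = np.cos(ai), np.sin(ai)
    cj, sj = np.cos(aj), np.sin(aj)
    ck, sk = np.cos(ak), np.sin(ak)
    Rz = np.array([[ci, -si, 0.0], [si, ci, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cj, 0.0, sj], [0.0, 1.0, 0.0], [-sj, 0.0, cj]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, ck, -sk], [0.0, sk, ck]])
    M = np.eye(4)
    M[:3, :3] = np.dot(Rz, np.dot(Ry, Rx))  # embed the 3x3 rotation in 4x4
    return M

M = euler_matrix_sketch(0.1, 0.2, 0.3)
# a proper rotation: orthogonal upper-left block, unit determinant
print(np.allclose(np.dot(M[:3, :3], M[:3, :3].T), np.eye(3)))  # prints True
```

The matrix algebra itself is already vectorized; the 10x cython speedup quoted above comes from eliminating per-call Python overhead when many such matrices are generated in a loop.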


Re: [Numpy-discussion] Numpy 1.3.0 rc1 OS X Installer

2009-03-30 Thread David Cournapeau
Chris Barker wrote:
>
> I see -- well that's good news. I've found the Universal library 
> requirements to be a pain sometimes, and it probably would be here if 
> Apple wasn't giving us lapack/blas.
>   

Yes, definitely. I could see a lot of trouble if people had to build a
universal ATLAS :)

>
> Well, neither Apple nor python.org's builds are 64 bit anyway at this 
> point. There is talk of quad (i386, ppc, i86_64, and ppc_64) builds in the
> future, though.
>   

Yes, but that's something that should be supported sooner rather
than later.

>
> bdist_mpkg should do "the right thing" if it's run with the right 
> python. So you need to make sure you run:
>
> /Library/Frameworks/Python.framework/Versions/2.5/bin/bdist_mpkg
>
> Rather than whatever one happens to be found on your PATH.
>   

Yes, that's the problem: this cannot work directly if I use virtual env,
since virtual env works by recreating a 'fake' python somewhere else.
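The advice quoted above boils down to pinning the interpreter explicitly instead of trusting PATH. A sketch (the framework path is the standard python.org layout; the bdist_mpkg invocation is shown as a comment because it only applies inside a numpy checkout on OS X):

```shell
# Illustrative: pick the interpreter explicitly instead of trusting PATH.
# On an OS X box this would be the python.org framework interpreter, e.g.
#   PY=/Library/Frameworks/Python.framework/Versions/2.5/bin/python
PY=${PY:-python3}
"$PY" -c 'import sys; print(sys.prefix)'   # confirm which install this is
# then build the installer with that same interpreter:
#   "$PY" setup.py bdist_mpkg
```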

> Well, maybe we need to hack bdist_mpkg to support this, we're pretty 
> sure that it is possible.
>
> I want to make sure I understand what you want:
>
> Do you want to be able to build numpy in a virtualenv, and then build a 
> mpkg that will install into the user's regular Framework?
>   

Yes  - more exactly, there should be a way to guarantee that if I create
a virtual env from a given python interpreter, I can target a .mpkg to
this python interpreter.

> Do you want to be able to build a mpkg that users can install into the 
> virtualenv of their choice?
>   

No - virtualenv is only an artefact of the build process - users should
not care or even know I use virtualenv. I use virtualenv as a fast,
poor-man's 'python chroot'. This way, I can build and install python in
a directory with minimum interaction with the outside environment.
Installing is necessary to build the doc correctly, and I don't want to
mess my system with setuptools stuff.
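The "poor-man's python chroot" workflow described above can be sketched as follows (paths are illustrative, and `python3 -m venv` stands in for the virtualenv tool of the time):

```shell
# Sketch: install into an isolated prefix so the docs can be built
# without touching the system Python (paths are illustrative).
python3 -m venv --without-pip /tmp/np-build-env
. /tmp/np-build-env/bin/activate
python -c 'import sys; print(sys.prefix)'   # now points inside the env
# ... build, install, and generate docs here, then:
deactivate
```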

> Of course, easy_install can do that, when it works!
>   

Except when it doesn't :)

> We were just talking about some of that last night -- we really need a 
> "easy_uninstall" for instance.
>   

Yes - but I think it is very difficult to do right with the current
design of easy_install (I have thought a bit about those issues
recently, and I have started writing something to organize my thoughts a
bit better - I can keep you posted if you are interested).

> By the way, for the libgfortran issue, while statically linking it may 
> be the best option, it wouldn't be too hard to have the mpkg include and 
> install /usr/local/lib/libgfortran.dylib (or whatever).
>   

I don't think it is a good idea: it would overwrite existing
libgfortran.dylib, which would cause a lot of issues because libgfortran
and gfortran have to be consistent. I know I would be very pissed if,
after installing a piece of software, some unrelated software got broken
or, worse, overwritten. That's exactly what bothers me with easy_install.

cheers,

David


Re: [Numpy-discussion] DVCS at PyCon

2009-03-30 Thread Eric Firing
David Cournapeau wrote:
> On Tue, Mar 31, 2009 at 6:16 AM, Bruce Southey  wrote:
>> It is now official that Python will switch to Mercurial (Hg):
>> http://thread.gmane.org/gmane.comp.python.devel/102706
>>
>> Not that it directly concerns me, but this is rather surprising given:
>> http://www.python.org/dev/peps/pep-0374/
> 
> I don't think it is: as Guido said in his email, someone has to make
> the decision, and endless discussions go nowhere, because you can
> always make arguments for one or the other. Since some core developers
> are strongly against git (Martin Loewis for example), and given that
> hg is used by several core python developers already, I think it makes
> sense.

I agree.  The PEP does not show overwhelming superiority (or, arguably, 
even mild superiority) of any alternative; I think the different systems 
have been tending to converge in their capabilities, and all are 
serviceable.  Mercurial *can* be viewed as easier to learn and use than 
git, and much faster than bzr.  Perhaps of interest to the numpy 
community is that mercurial is already in use by Sphinx, sage, and cython.

Disclosure: I use and like hg.

Eric

> 
> cheers,
> 
> David



Re: [Numpy-discussion] DVCS at PyCon

2009-03-30 Thread David Cournapeau
Eric Firing wrote:
>
> I agree.  The PEP does not show overwhelming superiority (or, arguably, 
> even mild superiority) of any alternative

I think this PEP was poorly written. You can't see any of the
advantages/differences of the different systems. Some people even said
they don't see the differences from svn. I think the reason is partly
that the PEP focused on existing python workflows, but the whole point,
at least for me, is to change the general workflow (for reviews, code
contributions, etc...). Stephen J. Turnbull sums it up nicely:

http://mail.python.org/pipermail/python-dev/2009-March/087968.html

FWIW, I tend to agree that Hg is less disruptive than git when coming
from svn, at least for the simple tasks (I don't know hg enough to have
a really informed opinion for more advanced workflows).

cheers,

David