Mike Ressler wrote:
> I'm trying to work with memmaps on very large files, i.e. > 2 GB, up 
> to 10 GB. The files are data cubes of images (my largest is 
> 1290(x)x1024(y)x2011(z)) and my immediate task is to strip the data 
> from 32-bits down to 16, and to rearrange some of the data on a 
> per-xy-plane basis. I'm running this on a Fedora Core 5 64-bit system, 
> with python-2.5b2 (that I believe I compiled in 64-bit mode) and 
> numpy-1.0b1. The disk has 324 GB free space.
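[The task described above — converting a large 32-bit image cube to 16 bits one xy-plane at a time — can be sketched with numpy.memmap. This is only an illustration, not Mike's actual script: the file names and the tiny dimensions are made up so it runs quickly; the real cube is 1290 x 1024 x 2011.]

```python
import numpy as np

# Hypothetical small dimensions for demonstration (real cube: 1290 x 1024 x 2011).
nx, ny, nz = 4, 3, 5

# Write a small 32-bit demonstration cube to disk.
cube32 = np.arange(nx * ny * nz, dtype=np.int32).reshape(nz, ny, nx)
cube32.tofile("cube32.dat")

# Memory-map the 32-bit source and a 16-bit destination, then convert
# plane by plane so only one xy-plane needs to be touched at a time.
src = np.memmap("cube32.dat", dtype=np.int32, mode="r", shape=(nz, ny, nx))
dst = np.memmap("cube16.dat", dtype=np.int16, mode="w+", shape=(nz, ny, nx))
for z in range(nz):
    dst[z] = src[z].astype(np.int16)  # per-xy-plane strip from 32 to 16 bits
dst.flush()
```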
I just discovered the problem.  All the places where 
PyObject_As<Read/Write>Buffer is used need to have the final argument 
changed to Py_ssize_t (which arrayobject.h defines as int when building 
against Python versions earlier than 2.5).
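[To illustrate why that matters: on a 64-bit build Py_ssize_t is 64 bits wide, but a plain int is still 32 bits, so a buffer length over 2 GB passed through an int out-parameter is silently truncated. A rough sketch of that truncation, with a made-up ~3 GB length:]

```python
# Hypothetical buffer length for a ~3 GB memmap (does not fit in 31 bits).
length = 3_000_000_000

# Two's-complement truncation to a signed 32-bit int, as an int
# out-parameter would store it:
truncated = length & 0xFFFFFFFF
if truncated >= 2**31:
    truncated -= 2**32

print(length, truncated)  # 3000000000 -1294967296
```

A negative "length" like that is why the large-memmap code fails until the argument type is widened to Py_ssize_t.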

This should be fixed in SVN shortly....

-Travis


_______________________________________________
Numpy-discussion mailing list
Numpy-discussion@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/numpy-discussion
