Thanks for all of the suggestions; we are migrating to 64-bit Python soon as well.
The environments are Win7 and Mac Mavericks.
carray sounds like what you mentioned, Chris - more info I just found at http://kmike.ru/python-data-structures/
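A minimal sketch of what I'm picturing, assuming the carray API as documented on PyPI (written from memory and untested here, so treat it as illustrative):

    import numpy as np
    import carray as ca  # compressed, chunked in-memory arrays (assumed API)

    # A large array of fairly compressible data
    a = np.arange(1e7)

    # Store it chunked and compressed in RAM; clevel is the compression level
    c = ca.carray(a, cparams=ca.cparams(clevel=5))

    print(c)             # repr shows the compression ratio and chunk layout
    print(c[1000:1010])  # slicing decompresses only the chunks that are touched

How much this buys us will obviously depend on how compressible the medical data actually are.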

- Ray Schumacher



At 12:31 PM 3/27/2014, you wrote:
On Thu, Mar 27, 2014 at 7:42 AM, RayS <r...@blue-cove.com> wrote:
I find this interesting, since I work with medical data sets of 100s
of MB, and regularly run into memory allocation problems when doing a
lot of Fourier analysis, waterfalls, etc. The per-process limit seems
to be about 1.3 GB on this 6 GB quad-i7 with Win7.


This sounds like 32-bit -- have you tried a 64-bit Python/numpy? Not that you won't have issues anyway, but you should be able to do better than 1.3 GB...
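A quick way to confirm which build you're running (pure standard library, just a sketch):

    import struct
    import sys

    # 8-byte pointers mean a 64-bit interpreter, 4-byte pointers mean 32-bit
    print("%d-bit Python" % (struct.calcsize("P") * 8))

    # Equivalent check: sys.maxsize is 2**63 - 1 on 64-bit builds, 2**31 - 1 on 32-bit
    print("64-bit" if sys.maxsize > 2**32 else "32-bit")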
 memmaps are also limited to RAM,


I don't think so, no -- but they are limited to 2 GB (I think) if you're using a 32-bit process
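A minimal sketch (the file name and shape are just placeholders) -- on a 64-bit build the mapped array can be much bigger than physical RAM, since the OS pages data in from disk only as you touch it:

    import numpy as np

    # Disk-backed array, roughly 10 GB on disk -- more than this machine's RAM.
    # 'waterfall.dat' is just a placeholder file name.
    mm = np.memmap('waterfall.dat', dtype=np.float64, mode='w+',
                   shape=(20000, 65536))

    mm[0, :] = np.random.rand(65536)  # writes go through the OS page cache
    mm.flush()                        # push dirty pages out to the file

The 2 GB ceiling is (I think) the 32-bit address-space limit; in a 64-bit process the practical limit is disk size rather than RAM.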

There is also a compressed array package out there -- I can't remember what it's called -- but if you have large, compressible arrays, that might help.
-CHB


--

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

chris.bar...@noaa.gov
_______________________________________________
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion
