Re: [Numpy-discussion] Huge arrays

2009-09-11 Thread Chad Netzer
On Tue, Sep 8, 2009 at 6:41 PM, Charles R Harris charlesr.har...@gmail.com wrote: More precisely, 2GB for Windows and 3GB for (non-PAE enabled) Linux. And just to further clarify, even with PAE enabled on Linux, any individual process has about a 3 GB address limit (there are hacks to raise
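The per-process address-space ceiling discussed above can be estimated from Python itself. A minimal sketch, assuming a hypothetical (256, 2_000_000) int16 array for illustration (not necessarily the OP's actual shape):

```python
import sys
import numpy as np

# sys.maxsize is 2**31 - 1 on a 32-bit build and 2**63 - 1 on 64-bit.
is_64bit = sys.maxsize > 2**32

# A hypothetical (256, 2_000_000) int16 array requests about 1 GiB --
# uncomfortably close to the ~2 GB (Windows) / ~3 GB (Linux) per-process
# limit once the interpreter, loaded libraries, and address-space
# fragmentation are accounted for.
bytes_needed = 256 * 2_000_000 * np.dtype(np.int16).itemsize
print(bytes_needed / 2**30)  # size in GiB
```

On a 32-bit interpreter, two such allocations cannot coexist even if the machine has plenty of physical RAM, which matches the failure mode described in this thread.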

Re: [Numpy-discussion] Huge arrays

2009-09-10 Thread Kim Hansen
On 9-Sep-09, at 4:48 AM, Francesc Alted wrote: Yes, this latter is supported in PyTables as long as the underlying filesystem supports files > 2 GB, which is very usual in modern operating systems. I think the OP said he was on Win32, in which case it should be noted: FAT32 has its

Re: [Numpy-discussion] Huge arrays

2009-09-10 Thread David Cournapeau
Kim Hansen wrote: On 9-Sep-09, at 4:48 AM, Francesc Alted wrote: Yes, this latter is supported in PyTables as long as the underlying filesystem supports files > 2 GB, which is very usual in modern operating systems. I think the OP said he was on Win32, in which

Re: [Numpy-discussion] Huge arrays

2009-09-09 Thread Francesc Alted
On Wednesday 09 September 2009 07:22:33 David Cournapeau wrote: On Wed, Sep 9, 2009 at 2:10 PM, Sebastian Haase seb.ha...@gmail.com wrote: Hi, you can probably use PyTables for this. Even though it's meant to save/load data to/from disk (in HDF5 format) as far as I understand, it can be

Re: [Numpy-discussion] Huge arrays

2009-09-09 Thread David Warde-Farley
On 9-Sep-09, at 4:48 AM, Francesc Alted wrote: Yes, this latter is supported in PyTables as long as the underlying filesystem supports files > 2 GB, which is very usual in modern operating systems. I think the OP said he was on Win32, in which case it should be noted: FAT32 has its upper
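The FAT32 caveat raised here is easy to quantify: FAT32 caps any single file at 2**32 - 1 bytes (just under 4 GiB), regardless of the format stored inside it. A back-of-the-envelope sketch:

```python
# FAT32 limits any single file to 2**32 - 1 bytes, no matter what file
# format (HDF5 included) is stored inside it.
FAT32_MAX_BYTES = 2**32 - 1

# With 2-byte int16 samples, that caps one file at just over 2 billion
# samples. NTFS or a modern Linux filesystem has no comparable practical
# limit, so the cap is a property of the filesystem, not of PyTables.
max_int16_samples = FAT32_MAX_BYTES // 2
print(max_int16_samples)  # 2_147_483_647
```

So a Win32 user on a FAT32 volume would need to split the dataset across multiple files even if PyTables itself handles larger arrays fine.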

Re: [Numpy-discussion] Huge arrays

2009-09-08 Thread David Cournapeau
On Wed, Sep 9, 2009 at 9:30 AM, Daniel Platz mail.to.daniel.pl...@googlemail.com wrote: Hi, I have a numpy newbie question. I want to store a huge amount of data in an array. This data comes from a measurement setup and I want to write it to disk later since there is nearly no time for this

Re: [Numpy-discussion] Huge arrays

2009-09-08 Thread Charles R Harris
On Tue, Sep 8, 2009 at 7:30 PM, Daniel Platz mail.to.daniel.pl...@googlemail.com wrote: Hi, I have a numpy newbie question. I want to store a huge amount of data in an array. This data comes from a measurement setup and I want to write it to disk later since there is nearly no time for
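A recurring suggestion in this thread is to let the OS page the data to disk instead of holding one giant in-memory array. A minimal sketch using `numpy.memmap` (filename and shape are hypothetical, and the shape is kept deliberately small for illustration):

```python
import numpy as np

# A memory-mapped array lives in a file; the OS pages it in and out, so
# the whole array need not fit in physical RAM at once. The same pattern
# scales to arrays far larger than memory.
shape = (256, 10_000)
data = np.memmap("measurements.dat", dtype=np.int16, mode="w+", shape=shape)

# Write one "measurement" row at a time, as it might arrive from hardware.
for i in range(shape[0]):
    data[i, :] = i  # placeholder for real acquisition
data.flush()  # make sure buffered pages reach the file

# Re-open read-only to confirm the data is on disk.
check = np.memmap("measurements.dat", dtype=np.int16, mode="r", shape=shape)
print(int(check[5, 0]))  # → 5
```

This sidesteps the "no time to write during acquisition" concern: rows are written incrementally and flushing is cheap, while the address-space limits of a 32-bit process still apply to the mapped region itself.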

Re: [Numpy-discussion] Huge arrays

2009-09-08 Thread Sturla Molden
Daniel Platz wrote: data1 = numpy.zeros((256,200),dtype=int16) data2 = numpy.zeros((256,200),dtype=int16) This works for the first array data1. However, it returns with a memory error for array data2. I have read somewhere that there is a 2GB limit for numpy arrays on a 32 bit
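Before allocating, it is worth computing how much memory a request actually asks for; `ndarray.nbytes` (or shape product times itemsize) makes this explicit. A sketch using the shapes from the (truncated) snippet above:

```python
import numpy as np

# The shapes mirror the truncated snippet in the thread; .nbytes reports
# the request size, which shows whether an allocation approaches the
# ~2 GB per-process limit of a 32-bit build.
data1 = np.zeros((256, 200), dtype=np.int16)
print(data1.nbytes)  # 102_400 bytes -- tiny as quoted; the failing
# allocation in the thread presumably used a far larger second dimension

# Same number, computed up front without allocating anything:
itemsize = np.dtype(np.int16).itemsize  # 2 bytes per int16
print(256 * 200 * itemsize)
```

Doing this arithmetic first distinguishes a genuine address-space problem from a typo in the shape, and suggests when splitting the data across several smaller arrays (as discussed in this message) is necessary.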

Re: [Numpy-discussion] Huge arrays

2009-09-08 Thread Sebastian Haase
Hi, you can probably use PyTables for this. Even though it's meant to save/load data to/from disk (in HDF5 format), as far as I understand it can be used to make your task solvable - even on a 32bit system !! It's free (pytables.org) -- so maybe you can try it out and tell me if I'm right. Or

Re: [Numpy-discussion] Huge arrays

2009-09-08 Thread David Cournapeau
On Wed, Sep 9, 2009 at 2:10 PM, Sebastian Haase seb.ha...@gmail.com wrote: Hi, you can probably use PyTables for this. Even though it's meant to save/load data to/from disk (in HDF5 format) as far as I understand, it can be used to make your task solvable - even on a 32bit system !! It's free