Hi, Steve,

$PDL::BIGPDL is a global variable that disables a safety/sanity check in the PDL engine. Setting the variable to 1 lets you declare a PDL as large as you want; otherwise the engine won't let you work with any single variable larger than 1GB. (When PDL was written 12 years ago, a 1GB array was a Big Deal and even allocating one could make many machines thrash; now many folks keep bumping against that limit.)
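For instance, something like this should get you past the check (a sketch; the array size is just illustrative):

    use PDL;

    # Disable the 1GB-per-piddle sanity check *before* allocating.
    # Set it as its own statement, not buried inside another expression.
    $PDL::BIGPDL = 1;

    my $huge = zeroes(double, 480_000, 500);   # ~1.9GB of doubles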

As to your very large variable question, what strategy you use really depends on what you are doing. Your I/O speed will be much higher if you store the pieces of your too-big array in binary, rather than ASCII, format. I often find myself using PDL::DiskCache; a DiskCache object is a tied Perl list (you treat it exactly like a normal Perl list ref) whose elements are PDLs that are stored on-disk as FITS files. A certain number of the elements are cached in memory, so as long as you are accessing only a few elements of the list at once you won't notice any slowdown at all. That way you can treat your 100,000 individual files as elements of a Perl list ref, and they'll silently get swapped in and out behind your back as you motor through the whole data set.
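A sketch of the idea (the file names are made up, and you should check the PDL::DiskCache docs for the exact constructor options):

    use PDL;
    use PDL::DiskCache;

    # Tie 100,000 on-disk FITS files to a Perl list ref, keeping only
    # a handful of them in memory at any one time.
    my $cache = diskcache( [ map { "data/col_$_.fits" } 0 .. 99_999 ],
                           { ro => 1 } );

    # Work through the whole set; elements get swapped in and out
    # behind your back as you go.
    my $total = 0;
    $total += $cache->[$_]->sum for 0 .. $#$cache;
    print "grand total: $total\n";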

I'm not familiar enough with rcols to know exactly what is up with your no-memory-at-all observation, but you may want to run 'help vars' again after actually using your recently-read-in PDLs. Many operations defer allocation until it is absolutely necessary.




On Sep 14, 2008, at 8:11 PM, Steve Cicala wrote:

Thanks Chris--The output is attached.

As an auxiliary question related to working with more data than the program can hold at once: if I have around 100,000 individual files, each containing a single column of the same length, is it better for I/O speed to convert all of the files once and load them with PDL::IO::FastRaw, or to read them from .txt each time using rcols()? I've noticed that PDLs loaded using rcols aren't taking up any memory (state VC in 'help vars').

Looking at Craig's output, it could be that I'm not calling $PDL::BIGPDL properly. When working in a script, is

$c=$PDL::BIGPDL=$a==$b->dummy

the way I allow $c to be >1Gb?

Many Thanks.

Steve

On Sun, Sep 14, 2008 at 2:40 PM, Chris Marshall <[EMAIL PROTECTED]> wrote:
Could you please run 'perldl -V' and send the output.
I could not reproduce the problem as I had out of memory
errors at that point.

Maybe someone with a large memory 64bit OS and
PDL could take a look further to debug the internals.

As a work around, just loop over your large $a values
by blocks small enough to run using slicing to extract
the values of interest for each step.
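Concretely, the blocking approach Chris describes might look something like this (a sketch assuming $a and $b are shaped as in Steve's example; the block size is arbitrary, adjust to taste):

    use PDL;

    my $block = 10_000;   # small enough that each $block x 500 mask fits easily
    my @idx;
    for (my $i = 0; $i < $a->nelem; $i += $block) {
        my $top = $i + $block - 1;
        $top = $a->nelem - 1 if $top > $a->nelem - 1;
        my $chunk = $a->slice("$i:$top");
        # same mask as before, but only $block x 500 elements at a time
        my $hits = which( maximum( ($chunk == $b->dummy(0))->xchg(0,1) ) != 0 );
        push @idx, ($hits + $i)->list;   # offset back into full-$a coordinates
    }
    my $d = pdl(\@idx);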

--Chris

Steve Cicala wrote:
Hi--I am running perldl on a Unix server with 32GB of memory, but am having a problem with the following operation. Here's a small version that works:
perldl> $a=sequence(5)

perldl> $b=pdl[1,4,0]

perldl> p $c=$a==$b->dummy

[
 [0 1 0 0 0]
 [0 0 0 0 1]
 [1 0 0 0 0]
]
(I am using this to collect indices in $a that correspond to elements in $b:
$d=which(maximum(($a==$b->dummy)->xchg(0,1))!=0)
).

Now, when I try to use this operation for large numbers (i.e. $a is 480000x1
and $b is 500x1) I get:

'multielement piddle in a conditional expression'

--And I get this error whether calculating $c on its own, or just sticking
the expression that generates $c into the one that generates $d.

I have also tried:

$c=$PDL::BIGPDL=$a==$b->dummy

and get the same error.

------------------------------------------------------------------------

_______________________________________________
Perldl mailing list
[email protected]
http://mailman.jach.hawaii.edu/mailman/listinfo/perldl
------------------------------------------------------------------------




<perldl_V.txt>

