Gerrit P. Haase wrote:

>> $ ./inter.pl
>> perl> sub foo($){$a=shift;foo($a+1);}
>> perl> foo 1
>> Out of memory during request for 4040 bytes, total sbrk() is 402624512 bytes!
>> Segmentation fault (core dumped)
>>
>> Another version (with "my $a"):
>>
>> perl> sub foo($){my $a=shift;foo($a+1);}
>> perl> foo 1
>> Out of memory during "large" request for 134221824 bytes, total sbrk() is 304633856 bytes at (eval 19) line 1.
>> perl> foo 1
>> Bad realloc() ignored at (eval 19) line 1.
>> Segmentation fault (core dumped)
>>
>> Is this a perl bug, Cygwin bug, or just a feature?
>
> I don't know.  Maybe it is a Windows feature that applications running
> out of memory are crashing?

But there's plenty of memory left when perl crashes. I have 1 GB of RAM and a 1 GB swap file.
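To be fair, the quoted foo() has no base case, so it would exhaust memory on any platform sooner or later; what bothers me is that the Cygwin build segfaults instead of dying cleanly, and that it gives up after only a few hundred MB. Here is a bounded variant of the same test (just a sketch; the depth limit and the progress output are mine, not part of inter.pl) that shows how deep the recursion gets before the interpreter fails:

#!/usr/bin/perl
# Sketch only: a bounded version of the quoted foo(), which never stops
# recursing.  Capping the depth makes the run finite, and the progress
# output shows roughly how far a given perl build gets before failing.
use strict;
use warnings;
no warnings 'recursion';    # silence "Deep recursion" warnings past 100 levels

sub foo {
    my ($n, $limit) = @_;
    print "reached depth $n\n" if $n % 100_000 == 0;
    return $n if $n >= $limit;              # base case the original lacks
    return foo($n + 1, $limit);
}

my $limit = shift(@ARGV) || 1_000_000;      # depth to probe; adjust as needed
print "finished at depth ", foo(1, $limit), "\n";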

I've simplified the test case. It seems that Cygwin perl can't use more than a few hundred MB of memory. For instance:

$ perl -e '$a="a"x(200 * 1024 * 1024); sleep 9'

OK, this could have failed because $a might require 200 MB of contiguous space. But hashes don't need contiguous storage, do they? Then why does the following code, which spreads about 400 MB across 400 separate 1 MB values, fail?

$ perl -e '$a="a"x(1024 * 1024);my %b; $b{$_}=$a for(1..400);sleep 9'

Or this one, which only needs five separate 50 MB scalars, about 250 MB in total?

$ perl -e '$a="a"x(50 * 1024 * 1024);$b=$a;$c=$a;$d=$a;$e=$a;sleep 10'
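On a platform where these snippets do work, the actual footprint of the hash version can be checked with the CPAN module Devel::Size. This is just a sketch and assumes the module is installed:

#!/usr/bin/perl
# Sketch: measure how much memory the hash from the example above really
# occupies.  Assumes the CPAN module Devel::Size is installed; run it on a
# platform where the allocation succeeds.
use strict;
use warnings;
use Devel::Size qw(total_size);

my $value = "a" x (1024 * 1024);   # 1 MB string, as in the one-liner
my %h;
$h{$_} = $value for 1 .. 400;      # 400 copies of the 1 MB string
printf "hash occupies about %.0f MB\n", total_size(\%h) / (1024 * 1024);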

On Linux there's no such problem: perl can use all available memory.
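To narrow down where the limit sits, one can also grow memory use in small, independent chunks and watch where it stops; the last total printed before perl dies gives a rough ceiling. This is only a sketch (the 16 MB chunk size and the roughly 2 GB cap are arbitrary choices of mine):

#!/usr/bin/perl
# Sketch: allocate many independent 16 MB strings and report the running
# total.  If perl dies with "Out of memory" part way through, the last
# line printed shows roughly how much memory the build will hand out.
use strict;
use warnings;

my $chunk_mb = 16;
my @chunks;
for my $i (1 .. 128) {             # up to about 2 GB, far beyond the failures above
    push @chunks, "a" x ($chunk_mb * 1024 * 1024);
    printf "allocated %4d MB so far\n", $i * $chunk_mb;
}
print "made it to the end\n";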

Krzysztof Duleba


