In article <[EMAIL PROTECTED]>,
        Michael W Thelen <[EMAIL PROTECTED]> writes:
> Well, maybe... the not unreasonable program "@h{0..3e8}=0..3e8" uses 535M of
> memory on my system although it only uses 3 million hash entries.  It seems to
> me that a program that runs in a reasonable time on a system with a reasonable
> amount of memory ought not to be rejected.  What is reasonable is open to
> interpretation, and of course a "reasonable" amount of available memory may
> increase with time.  Maybe there could be an "official" system with an
> "official" build of Perl on which potentially-rejectable solutions can be
> tested?  I doubt it would need to be used very often, but at least it would be
> available for situations like this.
>

That's more than four times as much memory as one of my main development
machines has (128M). My main machine only got to 768M with my last memory
upgrade; it was 256M before that. For many real machines this is DEEP in
swapping land, which can slow things down by more than a factor of 1000.
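For the curious, you can get a rough per-entry figure for a hash like that by
building a scaled-down version and asking Devel::Size (a CPAN module, not
core). This is only a sketch; the 1e6 range is just a convenient sample size,
not anything from the original program:

    #!/usr/bin/perl
    # Rough check of per-entry hash cost, scaled down from the quoted one-liner.
    use strict;
    use warnings;
    use Devel::Size qw(total_size);

    my $n = 1e6;
    my %h;
    @h{0..$n} = 0..$n;                 # same construct, smaller range
    my $bytes = total_size(\%h);
    printf "%d entries take %.1f MB (%.0f bytes/entry)\n",
        $n + 1, $bytes / 2**20, $bytes / ($n + 1);

Multiplying the per-entry figure out to the full range gives you a ballpark
for what the real program ends up eating.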

It's approaching the order of magnitude that glibc can allocate AT ALL when
using malloc (about 900M; beyond that the load area of the shared libraries
starts). Many other libcs have similar limits.
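If you want to see where your own libc gives up, a crude probe like the one
below works: it keeps grabbing 16 MB chunks and prints a running total, so the
last line before perl dies with "Out of memory!" marks the rough ceiling. The
chunk size is arbitrary, and on Linux with overcommit you may get OOM-killed
instead; either way, run it on a machine you don't mind thrashing:

    #!/usr/bin/perl
    # Crude probe of how much memory can be allocated before malloc gives up.
    use strict;
    use warnings;

    $| = 1;                            # flush each line, so nothing is lost at death
    my $chunk = 16 * 2**20;            # 16 MB per allocation
    my @blocks;
    my $total = 0;
    while (1) {
        push @blocks, "x" x $chunk;    # force a real allocation and keep it live
        $total += $chunk;
        printf "allocated %d MB so far\n", $total / 2**20;
    }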

For the automated golf mtve was working on, we were thinking of giving you 64M.
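Just to illustrate, a cap like that is easy to enforce with setrlimit(2); the
sketch below uses the CPAN module BSD::Resource and is only one way it could
be done, not a description of mtve's actual setup:

    #!/usr/bin/perl
    # Run a golf entry under a 64 MB address-space cap (a sketch, not the
    # actual test harness).
    use strict;
    use warnings;
    use BSD::Resource qw(setrlimit RLIMIT_AS);

    die "usage: $0 entry.pl [args]\n" unless @ARGV;

    my $limit = 64 * 2**20;                    # 64 MB address-space cap
    setrlimit(RLIMIT_AS, $limit, $limit)
        or die "setrlimit failed: $!";

    exec $^X, @ARGV                            # run the entry under the cap
        or die "exec failed: $!";

Run it as "perl cap.pl entry.pl" and any allocation past 64M simply fails, so
the entry dies instead of pushing the box into swap.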

In actual TPRs, programs have been rejected for less memory usage than that.

Usually the "a few million things" rule is really a *gift* that sometimes
allows unreasonable memory use. 535M without context is something I would call
totally unreasonable.
