* Ton Hospel <[EMAIL PROTECTED]> [2003-09-01 10:03]:
> > Well, maybe... the not unreasonable program "@h{0..3e8}=0..3e8" uses 535M of
> > memory on my system although it only uses 3 million hash entries.  It seems to
> > me that a program that runs in a reasonable time on a system with a reasonable
> > amount of memory ought not to be rejected.  What is reasonable is open to
> > interpretation, and of course a "reasonable" amount of available memory may
> > increase with time.  Maybe there could be an "official" system with an
> > "official" build of Perl on which potentially-rejectable solutions can be
> > tested?  I doubt it would need to be used very often, but at least it would be
> > available for situations like this.
> 
> That's almost 5 times as much memory as one of my main development
> machines (128M). My main machine only got 768M on my last memory upgrade;
> it was 256M before that. For many real machines this is DEEP in
> swapping land, which can slow things down by more than a factor of 1000.

Right... in which case a program that uses that much memory may exceed the time
constraints on a system with less real memory.  (By the way, I meant to write
3e6 instead of 3e8, of course.)
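
(For what it's worth, one quick way to check the actual footprint empirically
is to have the program report its own resident set size, e.g. on a system
whose ps supports "-o rss":

    perl -e '@h{0..3e6}=0..3e6; print `ps -o rss= -p $$`'

That prints the resident set size in kilobytes after the hash is built.  The
exact number will of course vary with the Perl build and the malloc it uses,
so take it as a sketch rather than a benchmark.)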

> It's getting to the order of magnitude glibc can allocate AT ALL when using
> malloc (which is about 900M; after that the load area of shared libraries
> starts). Many other libcs have similar limits.

I don't know much about this, but clearly if the system can't handle it, then
the program is invalid.  But that's why I proposed having an "official" system
to run questionable programs on, so it could be clearly determined whether "the
system" can handle it.

> For the automated golf mtve was working on, we were thinking of giving you 64M.
> 
> In actual TPRs, programs have been rejected for less memory usage.

That is fine... and if the solutions currently under discussion have to be
rejected for the same reason, it's understandable.  But I still think it's too
bad that there isn't currently an objective way of determining whether a
program uses an excessive amount of memory.
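
(One objective mechanism such an "official" system could use, assuming a
Bourne-style shell whose ulimit takes kilobytes: cap the address space in a
subshell before running the solution, e.g.

    ( ulimit -v 65536; perl solution.pl < pathological_input )

With a 64M cap, a solution that allocates too much simply dies with an
out-of-memory error, which turns "excessive memory use" into a clear
pass/fail.  The file names here are just placeholders.)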

> Usually the "a few million things" rule is really a *gift*, sometimes
> allowing unreasonable memory use. 535M without context I would call
> totally unreasonable.

Yeah, I think that's getting up into the unreasonable range too.  As I mentioned
before, I tend to be more empirical than theoretical.  So actually running a
solution on a real system with real inputs and seeing its behavior is more
meaningful to me than calculating a solution's behavior from an analysis of
the program and its possible inputs.  If a solution actually fails (out of
memory) or uses excessive time on a "reasonable" system with pathological
input, then I say it should be rejected.  But at this point, that's not what
the rules say, so based on the current rules the solutions may have to be
rejected anyway.  But that's for someone else to decide :-)  (Whose decision
is it, anyway?  Is it Mtv?)
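
(In that empirical spirit, GNU time can report wall-clock time and peak memory
in a single run:

    /usr/bin/time -v perl solution.pl < pathological_input

The "Maximum resident set size" line in its output is one objective number to
compare against whatever limit the rules settle on; "-v" is GNU-specific, so
on BSD-ish systems "-l" would be the rough equivalent.  The file names are
placeholders again.)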

-- Mike

-- 
Michael W. Thelen
In general, they do what you want, unless you want consistency.
    --Larry Wall in the perl man page
