Steven Schveighoffer wrote:
On Thu, 12 Aug 2010 13:05:53 -0400, Joe Greer <[email protected]>
wrote:
"Steven Schveighoffer" <[email protected]> wrote in
news:[email protected]:
Logically speaking, if an object isn't destructed, then it lives
forever, and if it continues to hold its resource, then we have a
programming error. The GC is for reclaiming memory, not files. It
can take a long time for a GC to reclaim an object and you surely
don't want a file locked for that long any more than you want it held
open forever. My point is that it is a programming error to expect
the GC to be involved in reclaiming anything but memory. IMO, the best
use of a finalizer is to error out if an object holding a resource
hasn't been destructed, because there is obviously a programming
error here and something has leaked. GCs aren't there to support
sloppy programming. They are there to make your life easier and
safer.
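
To make that concrete, here is a rough D sketch of the error-out
finalizer being described (the class name LogFile and its close()
method are invented for illustration, not taken from anyone's actual
code): explicit close() is the only supported way to release the file,
and the destructor, if the GC ever runs it as a finalizer, does nothing
with the resource except flag the leak.

    import core.stdc.stdio : FILE, fclose, fopen;

    class LogFile
    {
        private FILE* handle;
        private bool closed;

        this(string path)
        {
            import std.string : toStringz;
            handle = fopen(path.toStringz, "w");
            closed = (handle is null);   // nothing to close if open failed
        }

        // Deterministic cleanup: the only supported way to release the file.
        void close()
        {
            if (!closed)
            {
                fclose(handle);
                closed = true;
            }
        }

        // Finalizer: if the GC runs this while the file is still open,
        // the program forgot to call close() -- report the leak instead
        // of quietly cleaning up.
        ~this()
        {
            assert(closed, "LogFile leaked: close() was never called");
        }
    }

The finalizer never becomes a substitute for deterministic cleanup; it
only reports that the cleanup was missed.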
An open file, maybe, but why should the compiler decide the severity of
not closing the resource? What if the resource is just some
C-malloc'd memory?
That's the only example of a nearly unlimited resource which I've heard
thus far, but that raises the question: why the hell would you be using
malloc if you're not going to free it, when you have a language with a
GC? Effectively, the GC is freeing your malloc'd memory.
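
For contrast, a minimal D sketch of that C-malloc'd-memory case
(CBuffer is an invented name): here the finalizer frees memory and
nothing else, so the only cost of letting the GC trigger it late is
that the memory comes back late.

    import core.stdc.stdlib : free, malloc;

    // Backs its storage with the C heap, e.g. to keep a large buffer out
    // of the GC-scanned heap.  The destructor's only job is to hand the
    // memory back whenever the GC finally collects the object.
    class CBuffer
    {
        private void* data;
        private size_t length;

        this(size_t size)
        {
            length = size;
            data = malloc(size);   // free(null) is harmless if this fails
        }

        ~this()
        {
            // Memory is the one resource the GC already manages, so
            // freeing it at collection time is merely late, not wrong.
            free(data);
            data = null;
        }
    }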
I can think of a couple of reasons for using malloc, but finalisers
aren't a good solution to any of them.
That's a possible solution. I just don't like the blanket assumptions
being made.
Actually it's the absence of a use case.
Hypothesis: if a finalizer is run, where it actually DOES something (as
opposed to, for example, running a pile of asserts), there's always a bug.
It's an extreme hypothesis, so it should be really easy to disprove.
Can you come up with a counterexample?