On Thursday, 31 January 2013 at 23:53:26 UTC, Steven
Schveighoffer wrote:
On Thu, 31 Jan 2013 18:27:59 -0500, Jeremy DeHaan
<dehaan.jerem...@gmail.com> wrote:
On Wednesday, 30 January 2013 at 10:29:26 UTC, monarch_dodra
wrote:
To add to that, you also have to keep in mind that when the
program terminates (even normally), instead of running a
*full* collection cycle, the program just exits and lets the
OS reclaim all allocated memory. This is both faster and safer.
What this means is that while there is a guarantee that
"collection=>destruction", there is no guarantee that actual
collection will happen.
If you absolutely must be sure that something allocated gets
*destroyed*, either destroy it yourself via an explicit call,
or bind it to a stack based RAII scheme, possibly with
reference counting.
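A minimal sketch of the two options monarch_dodra mentions, in D (class and member names are illustrative):

```d
import std.stdio;

class Resource
{
    ~this() { writeln("destructor ran"); }
}

void main()
{
    // Option 1: explicitly run the destructor yourself.
    // destroy() calls ~this() immediately; the GC reclaims the
    // memory later (or never, if the program exits first).
    auto r = new Resource;
    destroy(r);

    // Option 2: bind cleanup to the enclosing scope, so the
    // destructor runs deterministically on scope exit.
    auto r2 = new Resource;
    scope(exit) destroy(r2);
}
```

Either way, destruction no longer depends on whether a collection ever happens.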
So there is no guarantee at all that a destructor will be
called even at the end of the program? Because there is an
example in the book using a class destructor to free allocated
data.
I'm pretty sure no GC guarantees running all destructors. I
think D's GC makes a good effort to do so.
I definitely understand now why not to rely on a destructor
to free up memory at runtime, but it seems counterintuitive
to be able to write a destructor with no guarantee it will
ever be called, even at cleanup.
A destructor should ONLY be used to free up resources other
than GC allocated memory. Because of that, it's generally not
used.
It should be used almost as a "last resort".
For example, a class that holds a file descriptor should have
both a destructor (which closes the descriptor) and a manual
close method. The former is to clean up the file descriptor in
case nobody thought to close it manually before all references
were gone, and the latter is because file descriptors are not
really managed by the GC, and so should be cleaned up when they
are no longer used.
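A sketch of the pattern Steve describes, assuming POSIX file descriptors (the class and method names are hypothetical):

```d
import core.sys.posix.fcntl : open, O_RDONLY;
import core.sys.posix.unistd : close;
import std.string : toStringz;

class FileHolder
{
    private int fd = -1;

    this(string path)
    {
        fd = open(path.toStringz, O_RDONLY);
    }

    // Manual cleanup: call this as soon as you know the
    // descriptor is no longer needed.
    void closeFile()
    {
        if (fd != -1)
        {
            close(fd);
            fd = -1;
        }
    }

    // Last-resort cleanup, in case nobody called closeFile()
    // before the object became unreachable. Note the destructor
    // touches only the plain int fd -- it must not access other
    // GC-managed objects, since those may already be collected.
    ~this()
    {
        closeFile();
    }
}
```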
This gives us a bit of a paradox: since the class is managed by
the GC, how do you know it's no longer used (that is, how do
you know this is the last reference to it)? That is really up
to the application design. But I wouldn't recommend relying on
the GC to clean up your descriptors.
-Steve
This makes a whole lot of sense to me. I never realized that a GC
could be imperfect in this regard. I also don't know much about
GC design and implementation, but that was the purpose of this
thread!
I definitely like the idea of having a manual way of cleaning up
and a destructor as a backup for times such as these.
Thanks for all the help guys! I learned a lot!