Unfortunately there are a few different cases.
Case 1: the problem is directories matching the name creduce-*
Case 1a: if C-Reduce is killed, it does not delete these directories.
Case 1b: yesterday I fixed a bug that caused C-Reduce to leave these
directories behind even when it exited normally -- please update your
C-Reduce to get the fix.
Either way, as Konstantin says, you need to remove these by hand.
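For example, something like this should do it (a sketch; it assumes the
directories live under $TMPDIR, which defaults to /tmp on most systems):

# remove leftover C-Reduce temporary directories by hand
rm -rf "${TMPDIR:-/tmp}"/creduce-*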
Case 2: the problem is files left by your compiler when it is killed.
These should be periodically deleted by hand (you can do this even while
C-Reduce is running).
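For example, GCC's leftovers typically show up as cc* files in $TMPDIR
(other compilers use different names), so a periodic cleanup can be as
simple as the sketch below; the name pattern and the one-hour age cutoff
are assumptions you should adjust:

# remove GCC-style temporary files that are more than an hour old;
# adjust the pattern and the age for your compiler
find "${TMPDIR:-/tmp}" -maxdepth 1 -type f -name 'cc*' -mmin +60 -delete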
I'm reluctant to start playing timeout games. First, the goal here is
high throughput and any time we spend waiting for some compiler to exit
nicely is time we aren't spending doing something useful. Second, some
compilers have a -pipe option that causes them to produce fewer
temporary files; I suggest trying this. Third, at least in my
experience, these files accumulate quite slowly.
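For the -pipe suggestion, the flag just goes on the compile command inside
your interestingness test. A sketch, where the source file name and the
string being grepped for are placeholders:

#!/bin/sh
# -pipe makes GCC use pipes instead of temporary files between
# compilation stages; small.c and the error message are placeholders
gcc -pipe -O2 -c small.c -o /dev/null 2>&1 | grep -q 'internal compiler error'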
However, if people think it would be useful for all of C-Reduce's
temporary directories to be created under a single parent directory, I'd
be happy to implement that (not before we release 2.1, however -- we're
trying to test frozen code right now).
John
On 07/10/2013 07:53 AM, Kees Bakker wrote:
Hi,
After running creduce there are lots of temp files in TMPDIR (on Linux
that is probably /tmp). I guess this is because of the
kill ('TERM', -$pid);
in the creduce Perl script.
Shouldn't we try to kill the compiler in a friendlier way first? Perhaps
with a TERM signal or something. That being said, I'm sure we need to
have a fallback procedure and really do a KILL if the compiler doesn't
want to exit on TERM.
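Something along these lines, perhaps (a shell sketch of the TERM-then-KILL
idea; the two-second grace period is an arbitrary choice, $pid is assumed
to be the compiler's process-group ID, and the real change would of course
go into the Perl script):

kill -TERM -- "-$pid"            # ask the whole process group to exit
sleep 2                          # give the compiler a moment to clean up
if kill -0 -- "-$pid" 2>/dev/null; then
    kill -KILL -- "-$pid"        # it ignored TERM, so force-kill it
fi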
Meanwhile I decided to create a fresh temporary directory before running
creduce and clean up that directory after creduce is done. Something like
this:
export TMPDIR=$(mktemp -d /tmp/mytempdir-XXXXXX)
creduce ....
[ -d "$TMPDIR" ] && rm -fr "$TMPDIR"