On Wed, Jan 14, 2009 at 7:12 PM, Eric Wilhelm
<[email protected]> wrote:
> # from Adam Kennedy
> # on Wednesday 14 January 2009 17:00:
>
>>Indefinite retry is not a good idea, but limited retry should cover
>>almost every case that is laggy without preventing legitimate failures
>>from happening.
>
> Of course not, and yes, certainly.
>
> for (1 .. 20) {
>     unlink($file) and last;
>     -e $file or last;   # already gone
>     warn "cannot unlink $file\n"; sleep 1;
> }
> -e $file and die $!;
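A bounded-retry delete along those lines could be wrapped up as a helper; this is only a sketch of the idea being discussed, not code from any module, and the sub name is made up for illustration:

```perl
use strict;
use warnings;

# Hypothetical helper: try unlink up to $tries times, sleeping
# between attempts, so a transiently held file (e.g. a scanner
# briefly locking it on Win32) gets a second chance. Returns true
# once the file is gone, false if it survives every attempt.
sub retry_unlink {
    my ($file, $tries) = @_;
    $tries ||= 20;
    for my $try (1 .. $tries) {
        return 1 if unlink $file;      # removed it
        return 1 unless -e $file;      # already gone
        warn "cannot unlink $file: $! (attempt $try)\n";
        sleep 1 if $try < $tries;
    }
    return 0;                          # still there; a real failure
}
```

The point of the limit is exactly what Adam describes: laggy cases succeed on a later pass, while a genuinely undeletable file (say, a permissions problem) still surfaces as a failure after the final attempt.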
I'm not sure what we're talking about. Is it "1 while unlink $f"?
If so, see this text from perlport.pod:
Don't assume that a single "unlink" completely gets rid of the file:
some filesystems (most notably the ones in VMS) have versioned
filesystems, and unlink() removes only the most recent one (it doesn't
remove all the versions because by default the native tools on those
platforms remove just the most recent version, too). The portable
idiom to remove all the versions of a file is
1 while unlink "file";
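Spelled out, the idiom just keeps calling unlink until it reports that no files were removed; the filename below is only an example:

```perl
use strict;
use warnings;

# The perlport idiom: unlink() returns the number of files it
# deleted, so looping until it returns 0 removes every version of
# the file on versioned filesystems (e.g. VMS), and is a harmless
# single call elsewhere.
my $file = 'example.txt';
1 while unlink $file;
warn "$file still present: $!\n" if -e $file;
```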
-Ken