Bug#631098: PHP Garbage Collection

2011-06-20 Thread Martin Meredith

Package: php5
Version: 5.3.6-12

I've had a problem where somehow, I've managed to end up with 
approximately 1,000,000 session files on my server.


Due to the large number of files, the current crontab entry to clear them 
was unable to deal with it (xargs would fail to take in the HUGE list of files).


It seems that using the -exec option of find, rather than xargs (even 
with its limit), might be a little bit more sane?



09,39 * * * * root   [ -x /usr/lib/php5/maxlifetime ] && [ -d 
/var/lib/php5 ] && find /var/lib/php5/ -type f -cmin 
+$(/usr/lib/php5/maxlifetime) -exec rm {} \;


That is what I currently have in my cron script to work around this issue.



--
To UNSUBSCRIBE, email to debian-bugs-dist-requ...@lists.debian.org
with a subject of unsubscribe. Trouble? Contact listmas...@lists.debian.org



Bug#631098: [php-maint] Bug#631098: PHP Garbage Collection

2011-06-20 Thread Ondřej Surý
Hi,

5.3.6-12 already ships an updated crontab file that no longer uses xargs.

Ondřej Surý

On 20.6.2011, at 10:34, Martin Meredith m...@debian.org wrote:

 Package: php5
 Version: 5.3.6-12
 
 I've had a problem where somehow, I've managed to end up with approximately 
 1,000,000 session files on my server.
 
 Due to the large amount of files, the current crontab to clear them was 
 unable to deal with it (xargs would fail to take in the HUGE list of files).
 
 It seems that rather than using xargs (even with the limit), that using the 
 -exec option of find might be a little bit more sane?
 
 
 09,39 * * * * root   [ -x /usr/lib/php5/maxlifetime ] && [ -d 
 /var/lib/php5 ] && find /var/lib/php5/ -type f -cmin 
 +$(/usr/lib/php5/maxlifetime) -exec rm {} \;
 
 Is what I currently have in my cron script to work around this issue.
 
 
 
 ___
 pkg-php-maint mailing list
 pkg-php-ma...@lists.alioth.debian.org
 http://lists.alioth.debian.org/cgi-bin/mailman/listinfo/pkg-php-maint






Bug#631098: PHP Garbage Collection

2011-06-20 Thread Bob Proulx
Martin Meredith wrote:
 I've had a problem where somehow, I've managed to end up with
 approximately 1,000,000 session files on my server.

That is definitely a very large number of files all in one place!

 Due to the large amount of files, the current crontab to clear them
 was unable to deal with it (xargs would fail to take in the HUGE
 list of files).

 It seems that rather than using xargs (even with the limit), that
 using the -exec option of find might be a little bit more sane?

I realize that the cron script has already changed in later releases,
but all the same this shouldn't have been a problem with the previous
version that used xargs.  Using xargs is an older but perfectly
acceptable way to deal with a very large list of filenames.  I suspect
there was some other problem here that wasn't diagnosed.
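
The claim that xargs copes with huge input lists can be checked
directly: xargs batches its input into multiple command invocations,
each kept below the command-line size limit.  A quick sketch (GNU
xargs assumed; the numbers here are just for illustration):

```shell
# Feed 200,000 arguments (well over the ~128 KiB command buffer GNU
# xargs uses by default) through xargs.  Each 'echo' prints one line,
# so the line count equals the number of batched invocations.
seq 1 200000 | xargs echo | wc -l
# More than one line of output: xargs split the list into batches
# instead of failing with "Argument list too long".
```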

In other words, while 'find . -exec command {} +' is more efficient
than 'find . -print0 | xargs -0 command', the two are equivalent in
functionality.  And 'find . -conditions -delete' is even better for
safety and efficiency, but the 'xargs -0' version is still
functionally correct.  Today it is best to use -delete, now that find
supports it.  But before -delete and the '{} +' form were available,
'find . -print0 | xargs -0 ...' was the standard of excellence, and it
will handle directories with millions of files in them.
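
The three forms compared above can be demonstrated side by side.  This
is only a sketch against a throwaway directory, and it uses -mmin
(modification time, which touch can set) rather than the -cmin test
the real crontab uses:

```shell
tmpdir=$(mktemp -d)
# One stale file (mtime two hours ago) and one fresh file.
touch -d '2 hours ago' "$tmpdir/old.sess"
touch "$tmpdir/new.sess"

# 1. Classic portable form: NUL-delimited names piped into xargs,
#    which batches them into as many rm invocations as needed.
find "$tmpdir" -type f -mmin +60 -print0 | xargs -0 -r rm --

touch -d '2 hours ago' "$tmpdir/old.sess"
# 2. POSIX '{} +' form: find batches the arguments itself, like xargs.
find "$tmpdir" -type f -mmin +60 -exec rm -- {} +

touch -d '2 hours ago' "$tmpdir/old.sess"
# 3. GNU find's built-in -delete: no child process spawned at all.
find "$tmpdir" -type f -mmin +60 -delete

ls "$tmpdir"    # only new.sess is left
rm -r "$tmpdir"
```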

In yet other words, I feel certain that the actual problem was
something else, not the use of 'find . ... -print0 | xargs -0 ...'
there.  For example, perhaps memory was so limited that the Linux
kernel's out-of-memory killer (OOM killer) came into play?  That is
just an example, not necessarily the specific problem you hit.

Bob


