http://bugzilla.spamassassin.org/show_bug.cgi?id=3872





------- Additional Comments From [EMAIL PROTECTED]  2004-10-09 08:54 -------
Subject: Re:  SA 3.0 creates randomly extreme big bayes_journal

On Sat, Oct 09, 2004 at 07:54:27AM -0700, [EMAIL PROTECTED] wrote:
> However, I don't know if the block usage remains the same when I make a
> "cp -a dir dir.new". I tar-gzipped the following dir, it's now 11 MB. Do you want to

Hrm.  I believe the issue is whether "cp" understands sparse files.  The Linux
cp I have (it looks like you're using Linux) says it only detects sparse files
by a "crude heuristic", so it may or may not preserve the holes.  To be safe,
I wouldn't trust cp.
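FWIW, with GNU cp you can take the heuristic out of the picture by forcing
sparse output.  A quick sketch (the filenames here are made up for
illustration; assumes GNU coreutils):

```shell
# Create a 10 MB file that is all hole -- apparent size 10 MB, ~0 blocks on disk.
truncate -s 10M sparse.dat
du -k sparse.dat                               # actual disk usage: tiny

cp sparse.dat copy-auto.dat                    # default: cp's heuristic decides
cp --sparse=always sparse.dat copy-sparse.dat  # punch holes wherever possible
du -k copy-auto.dat copy-sparse.dat            # compare allocated blocks
```

--sparse=always is the safe choice when copying journal files like these;
--sparse=never would expand every hole into real null-filled blocks.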

> have it? Can I upload it to an FTP or shall I provide it to you via ftp or 
> scp?

If you can make it available, I'll grab it from you.  I can make some ftp
space available if that's easier.

> crashes the machine eventually. I don't know of a way to limit this, AFAIK 
> ulimit applies to logins and this won't work for a daemon, correct?)

ulimit applies to processes and their children; a login is simply a shell
with child processes. ;)  (BTW: "ulimit -c 0" is great for httpd and the like,
to prevent core files from being written on a crash...)

>  46116 -rw-rw-rw-    1 root     www      47168872 Oct  8 16:54 bayes_journal
> 202004 -rw-rw-rw-    1 root     www      206639592 Oct  8 16:32 
> bayes_journal.old
>
>   File: `bayes_journal'
>   Size: 47168872        Blocks: 92232      IO Block: 4096   regular file
> 
>   File: `bayes_journal.old'
>   Size: 206639592       Blocks: 404008     IO Block: 4096   regular file

Both of these seem to be non-sparse.  In a cp'd copy of the .old file, I'd
expect to see a bunch of text, then a long run of nulls, then possibly more
text.
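The giveaway is in the stat output above: a file is sparse when its allocated
bytes (Blocks * 512) come out noticeably smaller than its apparent Size.
Here 92232 * 512 = 47222784, which is at least the 47168872-byte Size, so no
holes.  A quick check, assuming GNU stat:

```shell
# Report whether a file is sparse by comparing allocated vs. apparent size.
f=bayes_journal
size=$(stat -c %s "$f")                 # apparent size in bytes
alloc=$(( $(stat -c %b "$f") * 512 ))   # 512-byte blocks actually allocated
if [ "$alloc" -lt "$size" ]; then
    echo "$f: sparse ($alloc of $size bytes allocated)"
else
    echo "$f: not sparse"
fi
```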





------- You are receiving this mail because: -------
You are the assignee for the bug, or are watching the assignee.