Hi,

I’m wondering if anyone could assist me with an odd issue.

Recently, on newly deployed virtual machines that are built to use ClamAV's 
on-access scanning, disk space is "consumed" over a period of many days, until 
eventually all disk space is gone and the server becomes unresponsive (for 
obvious reasons).

I put "consumed" in quotes because `df` reports the space as used, but `du` 
does not. Here's example output from a test box I spun up a few weeks back.

---------------------------------------------------------
~# du -d 1 -h /
...
...
2.6G    /
---------------------------------------------------------
~# df -h
Filesystem      Size  Used Avail Use% Mounted on
udev            2.0G     0  2.0G   0% /dev
tmpfs           394M  624K  393M   1% /run
/dev/sda4        18G   14G  3.1G  82% /
tmpfs           2.0G     0  2.0G   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
/dev/sda1       470M   81M  365M  19% /boot
/dev/sda2       488M  3.5M  484M   1% /boot/efi
tmpfs           394M     0  394M   0% /run/user/1000
---------------------------------------------------------
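My understanding is that a df/du mismatch like this usually means a process is 
holding deleted files open: df counts blocks still reserved by open file 
handles, while du only sees files that are still linked in the directory tree. 
A quick check along these lines (a sketch, assuming lsof is installed) should 
show any open-but-deleted files:

---------------------------------------------------------
# +L1 lists open files with a link count below 1,
# i.e. files that were deleted but are still held open.
lsof +L1
---------------------------------------------------------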

Boxes that are busy doing something typically "fill up" and hang within 2-3 
days. The example above is a box I've been testing with that has been up for 
around 15 days; it's only at 82% full because it hasn't been up to much.

Until a few days ago I didn't know which piece of software was causing the 
issue. It felt like a "something is holding a file handle open" type of issue.

Today I discovered that if I kill clamonacc, df starts reporting the correct 
available disk space within a few minutes (the df output from that session is 
appended below my signature). Interestingly, it took about 60 seconds for all 
of the space to become available again. Restarting clamav-daemon or 
clamav-freshclam had no effect; only killing clamonacc did.
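
For the record, the recovery can be watched live with something like this (a 
sketch; pkill -x matches the exact process name, and watch is from procps):

---------------------------------------------------------
# Kill the on-access client, then watch the root
# filesystem usage drop over the next minute or so.
pkill -x clamonacc
watch -n 10 df -h /
---------------------------------------------------------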

Before I killed clamonacc, I checked whether it had any open file descriptors 
and, if so, how many.
---------------------------------------------------------
# ps -C clamonacc -o pid=
715
---------------------------------------------------------
# ls -l /proc/715/fd/
...
...
lrwx------ 1 root root 64 Feb 21 17:33 997 -> 'socket:[4212413]'
lrwx------ 1 root root 64 Feb 21 17:33 998 -> 'socket:[4212414]'
lrwx------ 1 root root 64 Feb 21 17:33 999 -> 'socket:[4220923]'
---------------------------------------------------------
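
Counting those entries is simple enough (a sketch; grep -c counts the lines 
matching 'socket:'):

---------------------------------------------------------
# Total fd entries, then just the socket ones.
ls /proc/715/fd/ | wc -l
ls -l /proc/715/fd/ | grep -c 'socket:'
---------------------------------------------------------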

All in all, there were 1,025 references to socket "files" in /proc/715/fd/ 
(for what it's worth, 1,024 is the default per-process open-file limit), all 
timestamped about 25 minutes earlier.

I'm not sure if it's relevant, but clamonacc is started at system startup by 
cron, as root, using the following command:

---------------------------------------------------------
/usr/bin/sleep 60 && /usr/sbin/clamonacc --fdpass
---------------------------------------------------------
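
(The corresponding crontab entry looks roughly like the line below; this 
assumes an @reboot job in root's own crontab, hence no user field.)

---------------------------------------------------------
@reboot /usr/bin/sleep 60 && /usr/sbin/clamonacc --fdpass
---------------------------------------------------------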

OS details
Distributor ID: Debian
Description:    Debian GNU/Linux 11 (bullseye)
Release:        11
Codename:       bullseye

Any help will be gratefully received!

Cheers

Steve

Contents of the attached df log, captured around the time I killed clamonacc:
---------------------------------------------------------
root@ss270122b:~# df -h
Filesystem      Size  Used Avail Use% Mounted on
udev            2.0G     0  2.0G   0% /dev
tmpfs           394M  624K  393M   1% /run
/dev/sda4        18G   14G  3.1G  82% /
tmpfs           2.0G     0  2.0G   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
/dev/sda1       470M   81M  365M  19% /boot
/dev/sda2       488M  3.5M  484M   1% /boot/efi
tmpfs           394M     0  394M   0% /run/user/1000
root@ss270122b:~# kill 715
root@ss270122b:~# df -h
Filesystem      Size  Used Avail Use% Mounted on
udev            2.0G     0  2.0G   0% /dev
tmpfs           394M  624K  393M   1% /run
/dev/sda4        18G   14G  3.1G  82% /
tmpfs           2.0G     0  2.0G   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
/dev/sda1       470M   81M  365M  19% /boot
/dev/sda2       488M  3.5M  484M   1% /boot/efi
tmpfs           394M     0  394M   0% /run/user/1000
root@ss270122b:~# df -h
Filesystem      Size  Used Avail Use% Mounted on
udev            2.0G     0  2.0G   0% /dev
tmpfs           394M  624K  393M   1% /run
/dev/sda4        18G   12G  5.8G  66% /
tmpfs           2.0G     0  2.0G   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
/dev/sda1       470M   81M  365M  19% /boot
/dev/sda2       488M  3.5M  484M   1% /boot/efi
tmpfs           394M     0  394M   0% /run/user/1000
root@ss270122b:~# df -h
Filesystem      Size  Used Avail Use% Mounted on
udev            2.0G     0  2.0G   0% /dev
tmpfs           394M  624K  393M   1% /run
/dev/sda4        18G  7.9G  9.0G  47% /
tmpfs           2.0G     0  2.0G   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
/dev/sda1       470M   81M  365M  19% /boot
/dev/sda2       488M  3.5M  484M   1% /boot/efi
tmpfs           394M     0  394M   0% /run/user/1000
root@ss270122b:~# df -h
Filesystem      Size  Used Avail Use% Mounted on
udev            2.0G     0  2.0G   0% /dev
tmpfs           394M  624K  393M   1% /run
/dev/sda4        18G  3.7G   14G  23% /
tmpfs           2.0G     0  2.0G   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
/dev/sda1       470M   81M  365M  19% /boot
/dev/sda2       488M  3.5M  484M   1% /boot/efi
tmpfs           394M     0  394M   0% /run/user/1000
root@ss270122b:~# df -h
Filesystem      Size  Used Avail Use% Mounted on
udev            2.0G     0  2.0G   0% /dev
tmpfs           394M  624K  393M   1% /run
/dev/sda4        18G  3.1G   14G  19% /
tmpfs           2.0G     0  2.0G   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
/dev/sda1       470M   81M  365M  19% /boot
/dev/sda2       488M  3.5M  484M   1% /boot/efi
tmpfs           394M     0  394M   0% /run/user/1000
root@ss270122b:~# df -h
Filesystem      Size  Used Avail Use% Mounted on
udev            2.0G     0  2.0G   0% /dev
tmpfs           394M  624K  393M   1% /run
/dev/sda4        18G  2.8G   15G  17% /
tmpfs           2.0G     0  2.0G   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
/dev/sda1       470M   81M  365M  19% /boot
/dev/sda2       488M  3.5M  484M   1% /boot/efi
tmpfs           394M     0  394M   0% /run/user/1000
root@ss270122b:~# df -h
Filesystem      Size  Used Avail Use% Mounted on
udev            2.0G     0  2.0G   0% /dev
tmpfs           394M  624K  393M   1% /run
/dev/sda4        18G  2.8G   15G  17% /
tmpfs           2.0G     0  2.0G   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
/dev/sda1       470M   81M  365M  19% /boot
/dev/sda2       488M  3.5M  484M   1% /boot/efi
tmpfs           394M     0  394M   0% /run/user/1000
---------------------------------------------------------

Attachment: clamd.conf
