Hi there,

On Mon, 21 Feb 2022, An Schall via clamav-users wrote:

... my issue is that when scanning folders recursively with
clamdscan, I merely receive an aggregated result on the entire folder
...
My aim is to log the per-file information to a configurable log file.

You could use the system logging facilities (syslog, rsyslog, ...) to
duplicate or redirect the information which clamd already logs to an
additional log file, or to a different one.  I use syslog-ng.  I guess
it's not to everyone's taste but it's quite flexible.  You might need
to script something to change the logfile name to whatever you need.
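As a very rough sketch of that approach (assuming clamd was built with
syslog support, and that your syslog-ng source is called s_src, which
varies between distributions; the log path is just an example), you
could point clamd at a dedicated syslog facility in clamd.conf and then
split that facility out into its own file:

8<----------------------------------------------------------------------
# clamd.conf: log via syslog on a dedicated facility, and log clean
# files too so you get a line per file, not just the infected ones
LogSyslog yes
LogFacility LOG_LOCAL6
LogClean yes

# syslog-ng.conf: send everything arriving on that facility to its own
# file (the source name s_src is an assumption -- check your distro)
filter f_clamd      { facility(local6); };
destination d_clamd { file("/var/log/clamd-scans.log"); };
log { source(s_src); filter(f_clamd); destination(d_clamd); };
8<----------------------------------------------------------------------

Reload clamd and syslog-ng after changing either file.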

There are plenty of alternatives.  For example, the 'find' utility:

8<----------------------------------------------------------------------
$ find /home/ged/hexdump_formats -type f | xargs clamdscan
/home/ged/hexdump_formats/format_16.spec_for_hexdump: OK
/home/ged/hexdump_formats/format_64.spec_for_hexdump: OK

----------- SCAN SUMMARY -----------
Infected files: 0
Time: 2.825 sec (0 m 2 s)
Start Date: 2022:02:21 09:17:23
End Date:   2022:02:21 09:17:26
8<----------------------------------------------------------------------

The 'find' utility can sometimes feel a bit like a cornered rat but it
does the job well.  You might need to consider how many arguments are
passed and how they are passed; filenames with spaces etc. could be an
issue unless you use the right 'find' and 'xargs' options.  Efficiency
can never be taken for granted but if you're running a clamd daemon at
least you aren't reloading the database for every file scanned.
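For instance, here's a minimal sketch of a variant of the command above
(GNU find and xargs assumed; the log path is only an example) which
handles awkward filenames and appends the per-file lines to a log of
your choosing.  clamdscan also has a --log=FILE option if you'd rather
have it write the report itself:

8<----------------------------------------------------------------------
# -print0 / -0 keep filenames with spaces or newlines intact;
# --no-summary drops the aggregate block so only per-file lines remain;
# tee -a appends them to the chosen log while still showing them
$ find /home/ged/hexdump_formats -type f -print0 \
    | xargs -0 clamdscan --no-summary \
    | tee -a /var/log/clamdscan-per-file.log
8<----------------------------------------------------------------------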

--

73,
Ged.
