On 4/4/2019 9:01 PM, Tim Hawkins wrote:
> We have a large number of services running inside Kubernetes that need to have
> access to clamav. Given the sheer number, I don't want to have to run a
> freshclam process on each virtual machine (container), due to the management
> and monitoring overhead, and the risk of some not updating for various reasons.
>
> Is there any easy way I can share the directories containing the definition
> database on one server image with all the others, so I only have one machine to
> monitor updates on? We can use Docker/Kubernetes' ability to share persistent
> volumes to do this. We will be running clamav in single-file scan mode and
> won't be using the daemon, so synchronising restarts of the daemon on updates
> is not required.

If you are simply scanning single files and loading the databases every time,
then you should be able to share the database directory with whatever method
you have available.
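
In Kubernetes that could be a single volume mounted read-only into every
scanning pod, with only the updater pod (the one running freshclam) mounting it
read-write. A minimal sketch, assuming a pre-existing ReadOnlyMany claim named
clamav-db and a container image with clamscan installed (both names are
placeholders, not anything from your setup):

    # Scanner pod: mounts the shared signature volume read-only.
    apiVersion: v1
    kind: Pod
    metadata:
      name: scanner
    spec:
      containers:
      - name: scanner
        image: my-clamav-image           # placeholder image that ships clamscan
        command: ["sleep", "infinity"]
        volumeMounts:
        - name: clamav-db
          mountPath: /var/lib/clamav     # clamscan's default database directory
          readOnly: true
      volumes:
      - name: clamav-db
        persistentVolumeClaim:
          claimName: clamav-db           # shared claim; freshclam updates it from one pod

That way only the freshclam pod ever writes to the claim, so there is a single
place to watch for update failures.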

On the other hand, keep in mind that it can take time for clamscan to load the
databases (especially on slower systems or if you have lots of third-party
signatures). If you have any scanning volume at all, you may want to use the
daemon instead, since it is MUCH faster. One solution would be to run the
daemon on one server and open a TCP port so the other servers can connect to it
with clamdscan to do scans. That way you only have one database directory and
one daemon process to worry about.
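
A rough sketch of that setup (the port, hostname, and file names below are just
example placeholders): on the server, enable the TCP listener in clamd.conf; on
each client, point clamdscan at that host and stream the file contents, since
the daemon cannot see the clients' filesystems.

    # clamd.conf on the scanning server
    TCPSocket 3310
    TCPAddr 0.0.0.0                      # or restrict to the cluster-internal address

    # /etc/clamav/clamd-remote.conf on each client (hypothetical file name)
    TCPSocket 3310
    TCPAddr clamd.example.internal       # placeholder hostname of the clamd server

    # Scan a file from a client, streaming its contents over the TCP connection
    clamdscan --stream --config-file=/etc/clamav/clamd-remote.conf /path/to/file

Note that clamd's TCP socket has no authentication, so that port should only be
reachable from inside the cluster.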

-- 
Bowie
