To reduce the stress on the DB you can use a custom format, as Amos suggested, but..

I think that once you define exactly what you want to log, you will get what you need.


The general squid access log is fairly loose, and I believe that with today's hardware the difference will only be seen on systems serving thousands or millions of client requests.

If this is a small deployment, it's not required.
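A trimmed custom format along those lines might look like this in squid.conf (a minimal sketch built from squid's documented logformat codes; the field selection and the name `db_min` are only examples, adjust to what you actually need):

```
# Hypothetical reduced format: timestamp, client IP, status, reply size,
# method, URL, user - a subset of the default "squid" logformat
logformat db_min %ts.%03tu %>a %>Hs %<st %rm %ru %[un
access_log daemon:/127.0.0.1/squid_log/access_log/squid/squid db_min
```

Fewer columns per record means less data for the daemon to insert on every request.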


All The Bests,

Eliezer


----

Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: elie...@ngtech.co.il




From: Alex K <rightkickt...@gmail.com> 
Sent: Sunday, May 13, 2018 01:56
To: Eliezer Croitoru <elie...@ngtech.co.il>
Cc: squid-users@lists.squid-cache.org
Subject: Re: [squid-users] Collecting squid logs to DB


+++ Including list +++

Hi Eliezer, 

I have used the following lines to instruct squid to log to mariadb: 

logfile_daemon /usr/lib/squid/log_db_daemon
access_log daemon:/127.0.0.1/squid_log/access_log/squid/squid squid
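For reference, log_db_daemon splits that daemon: path into its connection parameters. A sketch of the layout as I understand it (verify against your copy of the helper, since the argument order can differ between versions):

```
# daemon:/<host>/<database>/<table>/<user>/<password>
# so the path above would mean:
#   host:     127.0.0.1
#   database: squid_log
#   table:    access_log
#   user:     squid
#   password: squid
```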

Through testing it seems that sometimes squid does not log anything; I don't know why. After a restart it seems to unblock and write to the DB. 

The access_log table is currently InnoDB, and I am wondering whether MyISAM would behave better. 
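If you want to test that, switching engines is a single statement (table name taken from the config above; back up the data first, and note that MyISAM gives up transactions and crash safety in exchange for cheaper inserts):

```sql
ALTER TABLE access_log ENGINE=MyISAM;
```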


I would prefer to have a real-time access log. My scenario is that when a 
user disconnects from squid, an aggregated report of the sites the user 
browsed becomes available under some web portal where the user has access. 
Usually there will be up to 20 users connected concurrently, so I have to check 
whether this approach is scalable. If this approach is not stable, then I might 
go with log parsing (perhaps logstash or some custom parser) which will parse 
the logs and generate an aggregated report once per hour or day. 
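For the custom-parser variant, the aggregation itself is small. A minimal Python sketch, assuming the default native "squid" logformat (the field positions follow its documented layout; the sample lines and the per-user keying are illustrative only):

```python
from collections import defaultdict
from urllib.parse import urlsplit

def aggregate(lines):
    """Count visited hosts per user from native squid access.log lines.

    Assumes the default "squid" logformat:
    time elapsed client action/code size method URL user hierarchy/from type
    """
    report = defaultdict(lambda: defaultdict(int))
    for line in lines:
        fields = line.split()
        if len(fields) < 8:
            continue  # skip malformed/truncated lines
        client, url, user = fields[2], fields[6], fields[7]
        key = user if user != "-" else client  # fall back to the client IP
        # CONNECT requests log "host:port" instead of a full URL
        host = urlsplit(url).hostname or url.split(":")[0]
        report[key][host] += 1
    return {user: dict(hosts) for user, hosts in report.items()}

# Illustrative log lines, not real traffic
sample = [
    "1526162000.123 200 10.0.0.5 TCP_MISS/200 1024 GET "
    "http://example.com/index.html alice HIER_DIRECT/93.184.216.34 text/html",
    "1526162001.456 150 10.0.0.5 TCP_TUNNEL/200 4096 CONNECT "
    "example.org:443 alice HIER_DIRECT/93.184.216.35 -",
]
print(aggregate(sample))  # -> {'alice': {'example.com': 1, 'example.org': 1}}
```

A script like this could run from cron once per hour or day over the rotated access.log and feed the aggregated result into the portal, avoiding any load on the DB during browsing.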

Is there a way to format the log and pipe only some interesting fields to the 
DB, in order to lessen the load on it?


On Sun, May 13, 2018 at 1:25 AM, Eliezer Croitoru <elie...@ngtech.co.il> wrote:

Hey Alex,


How did you log into the DB? Which configuration lines did you use?

Also, what log format did you use?

Is it important to have real-time data in the DB, or is periodic parsing also 
an option?


Eliezer


----

Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: elie...@ngtech.co.il




From: squid-users <squid-users-boun...@lists.squid-cache.org> On Behalf Of Alex K
Sent: Saturday, May 5, 2018 01:20
To: squid-users@lists.squid-cache.org
Subject: [squid-users] Collecting squid logs to DB


Hi all, 

I had a previous setup on Debian 7 with squid, where I was using mysar to 
collect squid logs, store them in a DB, and provide a browsing report at the 
end of the day. 

Now on Debian 9, trying to upgrade the whole setup, I see that mysar does not 
compile. 

Checking around, I found mysar-ng, but it also has compilation issues on 
Debian 9. 

Do you suggest any tool that does this job? Does squid support logging to a DB 
natively? (I am using mysql/mariadb.)

Some other tool I stumbled on is https://github.com/paranormal/blooper. 


Thanx a bunch,

Alex


_______________________________________________
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users
