[analog-help] Processing large (1.95g compressed) logs ...

2001-02-11 Thread The Hermit Hacker
I've got a site that generates ~200 MB of logs per day, and the archive is up to almost 2 GB compressed ... not porn, they supply e-cards for Yahoo and other sites like that ... if I try to run analog against them, it takes days, and then it appears to crash, because no results are generated ... I'm
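One way to narrow down a run like this (a hedged sketch; the filenames and the sample data are hypothetical, and `wc -l` merely stands in for the analysis step) is to process one day's compressed log at a time, so only a single ~200 MB file is ever uncompressed on disk at once:

```shell
# Hypothetical sample data standing in for the real ~200 MB/day logs.
mkdir -p /tmp/logdemo
printf 'GET /a\nGET /b\n' | gzip > /tmp/logdemo/access_log.2001-02-01.gz
printf 'GET /c\n'         | gzip > /tmp/logdemo/access_log.2001-02-02.gz

# Stream-decompress one day at a time; the wc -l is a stand-in for
# whatever analysis is run over that day's file.
for day in /tmp/logdemo/access_log.*.gz; do
    gzip -dc "$day" > /tmp/logdemo/one_day.log   # decompress just this file
    wc -l < /tmp/logdemo/one_day.log             # analyze it
    rm /tmp/logdemo/one_day.log                  # reclaim space before the next day
done
```

Batching per day also makes it easier to spot whether one particular file, rather than the total volume, is what causes the crash.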

Re: [analog-help] Processing large (1.95g compressed) logs ...

2001-02-11 Thread Chuck Pierce
Well, the first thing you can do is not keep your logs compressed. Having analog uncompress the log files eats up a TON of memory, and analog is very memory-intensive anyway. I would suggest that you check out http://www.analog.cx/docs/lowmem.html, especially if you are hitting swap. - Chuck
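The lowmem page linked above describes configuration commands that trade report detail for memory. A hedged sketch of what such an analog.cfg fragment might look like (the exact command names and level meanings should be checked against that page for your analog version):

```
# Levels run from 0 (full detail, most memory) to 3 (least memory).
FILELOWMEM 2    # reduce memory used by the file/request report
HOSTLOWMEM 3    # hosts are usually the biggest table on busy sites
```

Dropping the host table to the lowest-memory level is often the single biggest win, since on a high-traffic site the set of distinct client hosts dwarfs the set of distinct files.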

RE: [analog-help] Processing large (1.95g compressed) logs ...

2001-02-11 Thread Dave Atkins
What are you using to unzip the files? How large are the individual log files? I am using a freeware gzip utility and processing the log files over a network share, because I do not have enough room to uncompress my log files. This works amazingly well -- far, far better than WebTrends. My largest
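The no-local-disk approach described here can be sketched with a pipe, assuming the consumer can read the log stream from standard input (the sample log line is hypothetical, and `wc -l` stands in for that consumer):

```shell
# Hypothetical sample compressed log standing in for a real one.
printf '1.2.3.4 - - [11/Feb/2001] "GET / HTTP/1.0" 200 512\n' \
    | gzip > /tmp/share_log.gz

# Decompress straight into the consumer via a pipe: no uncompressed
# copy ever touches the disk, which is the point when working from a
# share with limited local space.
gzip -dc /tmp/share_log.gz | wc -l
```

The trade-off is that a pipe can only be read once, so any tool that needs two passes over the log would still need a temporary uncompressed copy.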