<snip>
IMHO bad idea to use a web script to process log files of this size 
(please ignore this comment if you are using the command line version).
</snip>

Yes, this will be a script run from the command line.

<snip>
There are several good open source tools for parsing the apache log 
files (analog, webalizer, awstats to name a few). These are very fast 
and designed to handle large files that are generated by heavy traffic 
sites. You might want to look into them. Some of these log tools can 
produce 'machine readable' output as well.
</snip>

I'm not actually looking for stats in this case.  We had a very strange
occurrence yesterday: a few reports of porn links appearing on one of
our websites.  So basically I'm going to pull out all log entries
relating to that one specific section of our site (which won't be a huge
number) and then look through them to see if there's anything strange
there.
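
Something along the lines of this rough sketch is what I have in mind
(the log path, the output file and the URI prefix below are just
placeholders I'd fill in, and I'm sticking to plain fopen()/fgets()
since I'm on PHP 4.2.2).  Since it reads the log one line at a time,
memory shouldn't be an issue even on the full 1.2GB file:

<?php
// Rough sketch: stream the Apache access log line by line so the whole
// 1.2GB file is never held in memory, and keep only the entries that
// touch the section of the site in question.

$log     = '/path/to/access_log';      // placeholder path to the log
$section = '/some/section/';           // placeholder URI prefix to match
$out     = fopen('suspect_entries.log', 'w');

$fh = fopen($log, 'r');
if (!$fh) {
    die("Could not open $log\n");
}

$matches = 0;
while (!feof($fh)) {
    $line = fgets($fh, 8192);          // one log entry at a time
    if ($line === false) {
        break;
    }
    // crude substring match on the request URI; adjust as needed
    if (strpos($line, $section) !== false) {
        fwrite($out, $line);
        $matches++;
    }
}

fclose($fh);
fclose($out);

echo "$matches matching entries written to suspect_entries.log\n";
?>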

<snip>
Finally, 100MB chunks wouldn't be a problem. Even 1.2GB wouldn't be a 
problem if you had RAID and at least 512MB of memory.
</snip>

That's good.  I'm going to be doing this from my workstation, so RAID
isn't an option; I guess I'll use split to break the file into 100MB
chunks.
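
If I do go the split route, I figure a loop over the chunks along these
lines would do the trick (the chunk prefix, the section and the output
file are again just placeholders):

<?php
// Rough sketch: run the same kind of filter over each chunk written by
// split, assuming split was given a prefix of "access_part_" so the
// chunks come out as access_part_aa, access_part_ab, and so on.

$section = '/some/section/';           // placeholder URI prefix to match
$out     = fopen('suspect_entries.log', 'w');

$dir = opendir('.');                   // directory holding the chunks
while (($file = readdir($dir)) !== false) {
    if (strncmp($file, 'access_part_', 12) != 0) {
        continue;                      // not one of the split chunks
    }
    $fh = fopen($file, 'r');
    if (!$fh) {
        continue;
    }
    while (!feof($fh)) {
        $line = fgets($fh, 8192);
        if ($line !== false && strpos($line, $section) !== false) {
            fwrite($out, $line);
        }
    }
    fclose($fh);
}
closedir($dir);
fclose($out);
?>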

Thanks for your help.

Cheers,
Pablo


Pablo Gosse wrote:

><snip>
>What kind of a log file are we talking here? Regardless of what
>processing you need to do, generally working on a 1.2GB file without
>RAID and/or lots of memory is going to be slow.
>
>Pablo Gosse wrote:
>
>  
>
>>Hi folks.  Has anyone encountered any problems parsing large log files
>>with PHP?
>>
>>I've got a log file that's about 1.2 gig that I need to parse.
>>
>>Can PHP handle this or am I better off breaking this down into 12 100MB
>>chunks and processing it?
>>    
>>
></snip>
>
>It's an Apache log file.
>
>I'm going to have to parse this file outside of the web server, probably
>on my desktop machine.  It's a Dell Precision with 1GB RAM running RH9
>with Apache and PHP 4.2.2.
>
>If I can get the log file broken down into 100MB chunks I assume this
>would not be a problem?
>
>I've not attempted to deal with the file yet as I didn't know how PHP
>would react to a 1.2 gig file, and I'm in the final stages of a very
>important project and cannot afford any downtime.
>
>I assume PHP can handle 100MB chunks without choking.
>
>Cheers and TIA.
>
>Pablo
>
>  
>


-- 
Raditha Dissanayake.
------------------------------------------------------------------------
http://www.radinks.com/sftp/         | http://www.raditha.com/megaupload
Lean and mean Secure FTP applet with | Mega Upload - PHP file uploader
Graphical User Interface. Just 150 KB | with progress bar.

