Re: Managing very large files

2007-10-05 Thread Jorn Argelo
Steve Bertrand wrote: man 1 split (esp. -l) That's probably the best option for a one-shot deal like this. On the other hand, Perl itself provides the ability to go through a file one line at a time, so you could just read a line, operate, write a line (to a new file) as needed, over

Re: Managing very large files

2007-10-05 Thread Steve Bertrand
The reason for the massive file size was my haste in running out of the office on Friday and forgetting to kill the tcpdump process before the weekend began. Sounds like you may want a Perl script to automate managing your tcpdumps. 99% of the time I use tcpdump for less than one minute to
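The forgotten-over-the-weekend problem can also be handled by tcpdump itself, which supports rotating its own output. The interface name and filename pattern below are hypothetical, and running tcpdump needs root and a live interface, so this sketch only builds and prints the command rather than executing it.

```shell
# Hypothetical self-managing capture command:
# -G 3600 : rotate to a new file every hour (-w may use strftime patterns)
# -W 24   : stop after 24 rotated files instead of filling the disk
cmd="tcpdump -i em0 -G 3600 -W 24 -w trace-%Y%m%d-%H%M.pcap"
echo "$cmd"
```

With rotation in place, an orphaned tcpdump bounds its own disk usage instead of producing a single 28 GB file.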

Re: Managing very large files

2007-10-05 Thread Steve Bertrand
Check out Tie::File on CPAN. This Perl module treats every line in a file as an array element, and an element is loaded into memory only when it is requested. In other words, it works well with huge files such as these, since the entire file is not loaded into memory at once.

Re: Managing very large files

2007-10-05 Thread Bart Silverstrim
Steve Bertrand wrote: Heiko Wundram (Beenic) wrote: On Thursday, 04 October 2007 22:16:29, Steve Bertrand wrote: This is what I am afraid of. Just out of curiosity, if I did try to read the entire file into a Perl variable all at once, would the box panic, or as the saying goes 'what could

Re: Managing very large files

2007-10-04 Thread Steve Bertrand
Heiko Wundram (Beenic) wrote: On Thursday, 04 October 2007 14:43:31, Steve Bertrand wrote: Is there any way to accomplish this, preferably with the ability to incrementally name each newly created file? man 1 split Thanks. Sheesh, it really was that easy. *puts head in sand* Steve

Re: Managing very large files

2007-10-04 Thread Heiko Wundram (Beenic)
On Thursday, 04 October 2007 14:43:31, Steve Bertrand wrote: Is there any way to accomplish this, preferably with the ability to incrementally name each newly created file? man 1 split (esp. -l) -- Heiko Wundram Product Application Development
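The `split -l` answer can be shown in miniature: 2-line pieces stand in for 100,000-line ones, and a tiny `capture.txt` stands in for the 28 GB dump (both names are illustrative).

```shell
# Miniature version of the job.
printf 'l1\nl2\nl3\nl4\nl5\n' > capture.txt

# -l sets lines per piece; output files get incrementing suffixes
# (chunk.aa, chunk.ab, chunk.ac), which answers the naming question.
split -l 2 capture.txt chunk.

cat chunk.ac    # the last, short piece holds the leftover line: l5
```

split(1) streams its input, so it never needs to hold the whole file in memory, which is exactly what the original question asked for.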

Managing very large files

2007-10-04 Thread Steve Bertrand
Hi all, I've got a 28GB tcpdump capture file that I need to (hopefully) break down into a series of 100,000k lines or so, hopefully without the need of reading the entire file all at once. I need to run a few Perl processes on the data in the file, but AFAICT, doing so on the entire original

Re: Managing very large files

2007-10-04 Thread Giorgos Keramidas
On 2007-10-04 08:43, Steve Bertrand [EMAIL PROTECTED] wrote: Hi all, I've got a 28GB tcpdump capture file that I need to (hopefully) break down into a series of 100,000k lines or so, hopefully without the need of reading the entire file all at once. I need to run a few Perl processes on the

Re: Managing very large files

2007-10-04 Thread Chad Perrin
On Thu, Oct 04, 2007 at 02:58:22PM +0200, Heiko Wundram (Beenic) wrote: On Thursday, 04 October 2007 14:43:31, Steve Bertrand wrote: Is there any way to accomplish this, preferably with the ability to incrementally name each newly created file? man 1 split (esp. -l) That's probably

Re: Managing very large files

2007-10-04 Thread Steve Bertrand
man 1 split (esp. -l) That's probably the best option for a one-shot deal like this. On the other hand, Perl itself provides the ability to go through a file one line at a time, so you could just read a line, operate, write a line (to a new file) as needed, over and over, until you get

Re: Managing very large files

2007-10-04 Thread Heiko Wundram (Beenic)
On Thursday, 04 October 2007 22:16:29, Steve Bertrand wrote: This is what I am afraid of. Just out of curiosity, if I did try to read the entire file into a Perl variable all at once, would the box panic, or as the saying goes 'what could possibly go wrong'? Perl most certainly wouldn't make

Re: Managing very large files

2007-10-04 Thread Chad Perrin
On Thu, Oct 04, 2007 at 04:25:18PM -0400, Steve Bertrand wrote: Heiko Wundram (Beenic) wrote: Am Donnerstag 04 Oktober 2007 22:16:29 schrieb Steve Bertrand: This is what I am afraid of. Just out of curiosity, if I did try to read the entire file into a Perl variable all at once, would the

Re: Managing very large files

2007-10-04 Thread Chad Perrin
On Thu, Oct 04, 2007 at 04:16:29PM -0400, Steve Bertrand wrote: man 1 split (esp. -l) That's probably the best option for a one-shot deal like this. On the other hand, Perl itself provides the ability to go through a file one line at a time, so you could just read a line, operate,

Re: Managing very large files

2007-10-04 Thread Mel
On Thursday 04 October 2007 22:16:29 Steve Bertrand wrote: man 1 split (esp. -l) That's probably the best option for a one-shot deal like this. On the other hand, Perl itself provides the ability to go through a file one line at a time, so you could just read a line, operate, write

Re: Managing very large files

2007-10-04 Thread Steve Bertrand
Heiko Wundram (Beenic) wrote: On Thursday, 04 October 2007 22:16:29, Steve Bertrand wrote: This is what I am afraid of. Just out of curiosity, if I did try to read the entire file into a Perl variable all at once, would the box panic, or as the saying goes 'what could possibly go wrong'?

Re: Managing very large files

2007-10-04 Thread Jerry McAllister
On Wed, Oct 03, 2007 at 04:51:08PM -0600, Chad Perrin wrote: On Thu, Oct 04, 2007 at 04:25:18PM -0400, Steve Bertrand wrote: Heiko Wundram (Beenic) wrote: On Thursday, 04 October 2007 22:16:29, Steve Bertrand wrote: This is what I am afraid of. Just out of curiosity, if I did try to