Steve Bertrand wrote:
man 1 split
(esp. -l)
That's probably the best option for a one-shot deal like this. On the
other hand, Perl itself provides the ability to go through a file one
line at a time, so you could just read a line, operate, write a line (to
a new file) as needed, over and over.
The reason for the massive file size was my haste in running out of the
office on Friday and forgetting to kill the tcpdump process before the
weekend began.
Sounds like you may want a Perl script to automate managing your
tcpdumps.
99% of the time I use tcpdump for less than one minute to
Check out Tie::File on CPAN. This Perl module treats every line in a
file as an array element, and the array element is loaded into memory
when it's being requested. In other words, this will work great with
huge files such as these, since the entire file is not loaded into
memory at once.
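A minimal sketch of the Tie::File approach described above — the file name here is a placeholder, and Tie::File has been a core module since Perl 5.8, so nothing extra needs installing:

```perl
#!/usr/bin/perl
# Sketch: tie the capture file to an array. Lines are fetched from disk
# lazily, one at a time, so even a 28GB file is never slurped into memory.
# 'capture.txt' is a stand-in name for illustration.
use strict;
use warnings;
use Tie::File;

tie my @lines, 'Tie::File', 'capture.txt'
    or die "Cannot tie capture.txt: $!";

print "File has ", scalar @lines, " lines\n";
print "First line: $lines[0]\n";    # only this one line is read from disk

untie @lines;
```

Writes to the tied array are written back to the file, so the same technique works for in-place edits, not just reads.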
Steve Bertrand wrote:
Heiko Wundram (Beenic) wrote:
On Thursday 04 October 2007 22:16:29 Steve Bertrand wrote:
This is what I am afraid of. Just out of curiosity, if I did try to read
the entire file into a Perl variable all at once, would the box panic,
or as the saying goes 'what could possibly go wrong'?
Heiko Wundram (Beenic) wrote:
On Thursday 04 October 2007 14:43:31 Steve Bertrand wrote:
Is there any way to accomplish this, preferably with the ability to
incrementally name each newly created file?
man 1 split
Thanks.
Sheesh it really was that easy.
*puts head in sand*
Steve
On Thursday 04 October 2007 14:43:31 Steve Bertrand wrote:
Is there any way to accomplish this, preferably with the ability to
incrementally name each newly created file?
man 1 split
(esp. -l)
--
Heiko Wundram
Product Application Development
Hi all,
I've got a 28GB tcpdump capture file that I need to (hopefully) break
down into a series of 100,000k lines or so, hopefully without the need
of reading the entire file all at once.
I need to run a few Perl processes on the data in the file, but AFAICT,
doing so on the entire original file at once isn't feasible.
On Thu, Oct 04, 2007 at 02:58:22PM +0200, Heiko Wundram (Beenic) wrote:
On Thursday 04 October 2007 14:43:31 Steve Bertrand wrote:
Is there any way to accomplish this, preferably with the ability to
incrementally name each newly created file?
man 1 split
(esp. -l)
That's probably the best option for a one-shot deal like this. On the
other hand, Perl itself provides the ability to go through a file one
line at a time, so you could just read a line, operate, write a line (to
a new file) as needed, over and over, until you get through the whole file.
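To make the split(1) suggestion concrete, here is a toy run — the chunk size and file names are stand-ins for the real 28GB capture (which would first need to be in text form, e.g. from tcpdump -r, since -l splits on lines, not raw pcap records):

```shell
# Stand-in for the huge capture: a 10-line text file.
seq 1 10 > capture.txt

# -l N puts N lines per output file; names are generated
# incrementally as chunk.aa, chunk.ab, chunk.ac, ...
split -l 4 capture.txt chunk.

wc -l chunk.*
```

For the real file, `split -l 100000 capture.txt capture.` gives the incrementally named 100,000-line pieces asked for in the original post.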
On Thursday 04 October 2007 22:16:29 Steve Bertrand wrote:
This is what I am afraid of. Just out of curiosity, if I did try to read
the entire file into a Perl variable all at once, would the box panic,
or as the saying goes 'what could possibly go wrong'?
Perl most certainly wouldn't make the box panic.
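The memory-safe alternative to slurping the whole file is the read-a-line, write-a-line loop Chad described. A sketch in Perl — the input name, output naming scheme, and the 100,000-line chunk size are placeholders:

```perl
#!/usr/bin/perl
# Sketch: split a huge text file into fixed-size chunks while holding
# only one line in memory at a time, so file size doesn't matter.
use strict;
use warnings;

my $lines_per_chunk = 100_000;
my ( $count, $chunk ) = ( 0, 0 );

open my $in, '<', 'capture.txt' or die "Cannot open capture.txt: $!";
my $out;
while ( my $line = <$in> ) {
    if ( $count % $lines_per_chunk == 0 ) {
        close $out if $out;
        # Incrementally named output files: capture.000, capture.001, ...
        my $name = sprintf 'capture.%03d', $chunk++;
        open $out, '>', $name or die "Cannot open $name: $!";
    }
    print {$out} $line;
    $count++;
}
close $out if $out;
close $in;
```

This does the same job as split -l, but the per-line loop body is the natural place to hang whatever per-line Perl processing the original post needed.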