RE: Processing Large Files

2002-03-13 Thread RArul

One of the main reasons for chopping the file up in the first place was to
see what it looks like! Since your run on a ~1 GB file finished in the blink
of an eye, I now strongly suspect the line length. My guess is that it is one
large 300 MB file with all of its content on a single line, which would
explain it. Let me investigate, and if that is what it turns out to be, I
guess I should use the read() and tell() functions.
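
A rough sketch of what that might look like, assuming the file really is one
enormous line; the 64 KB chunk size and the ~1 MB cutoff are only
placeholders:

###
#!/usr/bin/perl
use strict;
use warnings;

# Read fixed-size chunks with read() instead of line by line, so a file
# with no newlines never has to be pulled into memory in one piece.
open(my $in,  '<', 'humongous.txt') or die "cannot open input file: $!";
open(my $out, '>', 'sample.txt')    or die "cannot open output file: $!";

my $chunk_size = 64 * 1024;          # 64 KB per read(); tune as needed
my ($buffer, $copied) = ('', 0);

while (read($in, $buffer, $chunk_size)) {
    print {$out} $buffer;
    $copied += length $buffer;
    # tell($in) reports the current byte offset, if progress matters
    last if $copied >= 1024 * 1024;  # stop after roughly 1 MB of sample
}

close($in);
close($out);
##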

Thanks,
Rex

-Original Message-
From: Nikola Janceski [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, March 13, 2002 11:29 AM
To: '[EMAIL PROTECTED]'; [EMAIL PROTECTED]
Subject: RE: Processing Large Files


I cut and pasted your code and ran it against a 1,070,725,103-byte (~1 GB)
file with about 100 bytes per line, and it finished in a split second.

Are the lines in your file giant sized (>1 MB per line)?

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, March 13, 2002 11:20 AM
To: [EMAIL PROTECTED]
Subject: Processing Large Files


Friends,

I need to process 300+ MB text files. I tried to open one such file, read 10
lines (line by line), and output those 10 lines to a new text file. Even
though I am only reading line by line, it takes a very long time. Is there
anything I can try to speed it up?

Here is my pronto code:

###
#!/usr/bin/perl
use strict;

my $from = "humongous.txt";
open(INP, $from)         or die "cannot open input file: $!";
open(OUT, ">sample.txt") or die "cannot open output file: $!";

print("trying to read input file\n");
while (<INP>) {              # read one line at a time
    print STDOUT;
    print OUT;
    last if ($. >= 10);      # stop after the first 10 lines
}
close(INP);
close(OUT);
##


Thanks,
Rex







RE: Processing Large Files

2002-03-13 Thread Nikola Janceski

I cut and pasted your code and ran it against a 1,070,725,103-byte (~1 GB)
file with about 100 bytes per line, and it finished in a split second.

Are the lines in your file giant sized (>1 MB per line)?
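
A quick way to check (a throwaway one-liner; it assumes the file is named
humongous.txt) is to scan for the longest line:

###
perl -ne '$max = length if length > $max; END { print "longest line: $max bytes\n" }' humongous.txt
##

If that number comes back in the hundreds of megabytes, the while loop below
is spending all its time hunting for a newline rather than doing any work.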

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, March 13, 2002 11:20 AM
To: [EMAIL PROTECTED]
Subject: Processing Large Files


Friends,

I need to process 300+ MB text files. I tried to open one such file, read 10
lines (line by line), and output those 10 lines to a new text file. Even
though I am only reading line by line, it takes a very long time. Is there
anything I can try to speed it up?

Here is my pronto code:

###
#!/usr/bin/perl
use strict;

my $from = "humongous.txt";
open(INP, $from)         or die "cannot open input file: $!";
open(OUT, ">sample.txt") or die "cannot open output file: $!";

print("trying to read input file\n");
while (<INP>) {              # read one line at a time
    print STDOUT;
    print OUT;
    last if ($. >= 10);      # stop after the first 10 lines
}
close(INP);
close(OUT);
##


Thanks,
Rex










Processing Large Files

2002-03-13 Thread RArul

Friends,

I need to process 300+ MB text files. I tried to open one such file, read 10
lines (line by line), and output those 10 lines to a new text file. Even
though I am only reading line by line, it takes a very long time. Is there
anything I can try to speed it up?

Here is my pronto code:

###
#!/usr/bin/perl
use strict;

my $from = "humongous.txt";
open(INP, $from)         or die "cannot open input file: $!";
open(OUT, ">sample.txt") or die "cannot open output file: $!";

print("trying to read input file\n");
while (<INP>) {              # read one line at a time
    print STDOUT;
    print OUT;
    last if ($. >= 10);      # stop after the first 10 lines
}
close(INP);
close(OUT);
##
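
If the long-line theory from the replies above pans out, one alternative
sketch is to keep the same loop shape but point $/ at a number, which makes
each read return a fixed-size record instead of waiting for a newline (file
names and the 64 KB record size are placeholders):

###
#!/usr/bin/perl
use strict;
use warnings;

# With $/ set to \65536, each read from <$in> returns a 64 KB record, so
# the loop never stalls scanning a 300 MB "line" for a newline.
local $/ = \65536;

open(my $in,  '<', 'humongous.txt') or die "cannot open input file: $!";
open(my $out, '>', 'sample.txt')    or die "cannot open output file: $!";

while (my $record = <$in>) {
    print {$out} $record;
    last if $. >= 10;    # $. counts records here, so this copies ~640 KB
}

close($in);
close($out);
##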


Thanks,
Rex