On 02/01/2014 15:21, mani kandan wrote:
Hi,

We have a huge file, about 500MB in size, and need to manipulate it: do some
replacements and then write it back out. I have used File::Slurp and it works
for a 300MB file (thanks Uri), but for this 500MB file it does not work and
exits with an error. I have also tried the Tie::File module, with the same
result. Any guidance?

Slurping entire files into memory is usually overkill, and you should
only do it if you can afford the memory and *really need* random access
to the entire file at once. Most of the time a simple sequential
read/modify/write is appropriate, and Perl will take care of buffering
the input and output files in reasonable amounts.

According to your later posts you have just 2GB of memory, and although
Windows XP *can* run in 500MB I wouldn't like to see a program that
slurped a quarter of the entire memory.

I haven't seen you describe what processing you want to do on the file.
If the input is a text file and the changes can be done line by line,
then you are much better off with a program that looks like this

use strict;
use warnings;

# Read, modify and write one line at a time; Perl buffers the I/O,
# so memory use stays small however big the file is
open my $in,  '<', 'myfile.txt'  or die "Cannot open myfile.txt: $!";
open my $out, '>', 'outfile.txt' or die "Cannot open outfile.txt: $!";

while (<$in>) {
  s/from string/to string/g;
  print $out $_;
}

close $out or die "Cannot close outfile.txt: $!";

__END__

But if you need more, then I would guess that Tie::File is your best
bet. You don't say what problems you are getting using this module, so
please explain.
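
For comparison, here is a minimal sketch of how Tie::File is normally used
for this kind of job (the filename and the substitution are only
placeholders). Each element of the tied array corresponds to one line of the
file, and changes to the array are written back to disk for you:

use strict;
use warnings;
use Tie::File;

# Tie the file to an array: $lines[0] is the first line, and so on
tie my @lines, 'Tie::File', 'myfile.txt' or die "Cannot tie myfile.txt: $!";

# Modifying the aliased $_ updates the array, and so the file itself
for (@lines) {
  s/from string/to string/g;
}

untie @lines;

__END__

Tie::File keeps only a limited cache in memory rather than the whole file, so
it should not run out of memory even on a 500MB file, although it can be slow
if the replacements change the length of the lines.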

Rob
