-----Original Message-----
From:   Joe Schell [SMTP:[EMAIL PROTECTED]]
Sent:   Tuesday, April 30, 2002 11:17 AM
To:     Jae Danner; [EMAIL PROTECTED]
Subject:        RE: Large files > 2GB

Sorry, ActivePerl 5.6.0.618 on Windows NT 4.0 SP6.
Jae
 


> -----Original Message-----
> Behalf Of Jae Danner
>
>
> I'm new to perl and I have some questions about using it to
> manipulate large files. I'm working with a file that is 5,323,771,971
> bytes in size and has a record length of 353 (including CRLF). If
> I run this script against the file the script ends prematurely:
>
>

What version of perl are you using?

> open (FILE, "test.dat") or die("Can't open file\n");
> my $idx = 0;
> while (<FILE>) {
>       $idx++;
> }
> print "records: $idx\n";
> close(FILE);
>
> The script above terminates and reports 15,077,180 records. The
> file however contains 15,081,507 records. I suspect that this
> file possibly contains a ^Z which triggers the early termination.
> I haven't yet tried the binmode() function, but I have
> successfully read the entire file using sysread in the while
> statement: " while (sysread FILE, $line, 353) ".
> My concern is with data corruption. Even if I can read each
> line/record of data from a file of this size is there a danger
> that the data could become corrupt?

Are you concerned with sysread or perl in general?
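For what it's worth, a sysread loop in binary mode sidesteps line-ending translation entirely, which makes it a sound fit for fixed-length records. A minimal self-contained sketch (the filename here is a placeholder, and I generate a small sample file rather than assume your 5 GB one; only the 353-byte record length is taken from your message):

```perl
use strict;
use warnings;

my $file   = 'test_sample.dat';
my $reclen = 353;                      # record length including CRLF

# Build a small sample: 3 fixed-length records (351 data bytes + CRLF).
open my $out, '>', $file or die "Can't create $file: $!";
binmode $out;                          # write raw bytes, no translation
print $out ('x' x ($reclen - 2)) . "\r\n" for 1 .. 3;
close $out;

# Count records with sysread in binary mode; CRLF and ^Z are plain bytes.
open my $fh, '<', $file or die "Can't open $file: $!";
binmode $fh;
my ($count, $buf) = (0, '');
while (my $got = sysread $fh, $buf, $reclen) {
    die "short record ($got bytes)" if $got != $reclen;
    $count++;
}
close $fh;
print "records: $count\n";             # prints "records: 3"
unlink $file;
```

Checking that each sysread returns exactly the record length is a cheap way to catch a truncated or misaligned file as you go.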

Percentage-wise, there is probably less chance that perl will corrupt your
file than if you wrote your own code in C, though there are no guarantees.
And if it does, at least you don't have to blame yourself.
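As for the ^Z suspicion: on Windows a filehandle opened without binmode is in text mode, where a ^Z (0x1A) is treated as end-of-file, so a stray one in the data would explain the early stop. binmode makes it an ordinary byte. A small self-contained sketch (the filename is a placeholder; the embedded ^Z is deliberate):

```perl
use strict;
use warnings;

# Write a sample file with a ^Z (0x1A) embedded inside the first record.
my $file = 'sample.dat';
open my $out, '>', $file or die "Can't create $file: $!";
binmode $out;                       # raw bytes out, no translation
print $out "record-one\x1Arest\n";  # embedded ^Z mid-record
print $out "record-two......\n";
close $out;

# Read it back in binary mode; the ^Z no longer terminates input.
open my $in, '<', $file or die "Can't open $file: $!";
binmode $in;
my $count = 0;
$count++ while <$in>;
close $in;
print "records: $count\n";          # prints "records: 2"
unlink $file;
```

Without the binmode on the read handle, a Windows text-mode read would stop at the ^Z and undercount, which matches the symptom you describe.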

> What are the safe practices
> for manipulating files of this size? Or should I trash the whole
> idea and go back to C. Thanks.
>

_______________________________________________
Perl-Win32-Users mailing list
[EMAIL PROTECTED]
To unsubscribe: http://listserv.ActiveState.com/mailman/mysubs
