Why Premature end of script headers?

2002-07-15 Thread Octavian Rasnita

Hi all,

I've made a little script that takes the lines from a file, removes the
duplicate lines and prints the result to a new file.
I read the original file line by line because it is a big file (over 7 MB).
The problem is that after printing almost 10% of the original file into
the new file, the script dies and the only error I get in the log file is:

[Mon Jul 15 14:44:48 2002] [error] [client 127.0.0.1] Premature end of
script headers: clean.pl

I've checked the original file to see if there are some strange characters
in that line or in the next one, but I haven't found anything that might
cause the problem.

Please tell me why the script dies only after running for a few minutes
and writing over 11,000 lines, and not from the beginning, if there is
a problem with the script.

Can you find any problems with my script?

Thank you very very much!

Here is the script:

#!/perl/bin/perl -w

print "Content-type: text/html\n\n";

#The original file:
my $file = "f:/teddy/data/er.txt";
#The result:
my $out = "f:/teddy/data/er_new_good.txt";

#Create the result file (empty):
open (OUT, ">$out");
print OUT "";
close OUT;

#Open the original file:
open (FILE, "<$file");
LINE: while (<FILE>) {
    my $line = $_;

    #Open the result file:
    open (OUT, "<$out");
    while (<OUT>) {

        #Check if the line from the original file exists in the result file:
        if ($line eq $_) {
            #If the line exists, jump back and read the next line from the original file:
            next LINE;
        }
        else {
            #Read the next line from the result file:
            next;
        }
    #End while for the result file:
    }
    close OUT;

    #Open the result file and append the line that was not found:
    open (OUT, ">>$out");
    print OUT $line;
    close OUT;
#Close the while loop for the original file:
}


Teddy Center: http://teddy.fcc.ro/
Mail: [EMAIL PROTECTED]







RE: Why Premature end of script headers?

2002-07-15 Thread Bob Showalter

 -Original Message-
 From: Octavian Rasnita [mailto:[EMAIL PROTECTED]]
 Sent: Monday, July 15, 2002 8:18 AM
 To: [EMAIL PROTECTED]
 Subject: Why Premature end of script headers?
 
 
 Can you find any problems with my script?

Probably your web server is timing out the request and killing your
script. Web servers don't like to run long-running processes like
this. Perhaps you can fork off a child and have the child take care
of it.

See http://www.stonehenge.com/merlyn/WebTechniques/col20.html for a
nice technique on this.
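
A minimal sketch of the fork-off-a-child idea (untested, and note that fork
emulation on the original poster's Windows setup may not behave exactly like
it does on Unix; the column above covers the technique properly):

#!/perl/bin/perl -w
use strict;

print "Content-type: text/html\n\n";
print "Cleaning has started; the result file should appear in a few minutes.\n";

defined(my $pid = fork()) or die "Cannot fork: $!";
if ($pid == 0) {
    # Child: let go of the browser connection so the server can finish the
    # request, then do the long-running work in the background.
    close STDIN;
    close STDOUT;
    close STDERR;
    # ... remove the duplicate lines here ...
    exit 0;
}
# Parent: nothing left to do; the response is already complete.
exit 0;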





Re: Why Premature end of script headers?

2002-07-15 Thread Lisa Nyman

Hi,

On Mon, 15 Jul 2002, Octavian Rasnita wrote:

 I've made a little script that takes the lines from a file, removes the
 duplicate lines and prints the result to a new file.
 The problem is that after printing almost 10% of the original file into
 the new file, the script dies and the only error I get in the log file is:
 [Mon Jul 15 14:44:48 2002] [error] [client 127.0.0.1] Premature end of
 script headers: clean.pl

 #!/perl/bin/perl -w

 print "Content-type: text/html\n\n";

I'm not sure why you are doing this.  You never output anything to
the web page aside from the header.  Did you want this to be a web
application?  If so, remember that the web server needs the correct
permissions to manipulate the files.  You would catch permission problems
if you checked the error return $! from your open statements.  That's
always a wise thing to do.
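
For example, each open in the script could be written along these lines
(using the variables from the original post), so a failure shows up in the
error log instead of passing silently:

open(FILE, "<$file") or die "Cannot open $file: $!";
open(OUT, ">>$out")  or die "Cannot append to $out: $!";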

As for editing files, take a look at Tie::File.
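
For instance, something along these lines (an untested sketch, using the path
from the original post) would remove the duplicate lines in place, with the
tied array writing the changes back to the file itself:

use strict;
use Tie::File;

my $file = "f:/teddy/data/er.txt";

tie my @lines, 'Tie::File', $file or die "Cannot tie $file: $!";
my %seen;
# Keep only the first occurrence of each line.
@lines = grep { !$seen{$_}++ } @lines;
untie @lines;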

-lisa






Re: Why Premature end of script headers?

2002-07-15 Thread perl-dvd

 Probably your web server is timing out the request and killing your
 script. Web servers don't like to run long-running processes like
 this. Perhaps you can fork off a child and have the child take care
 of it.

Another solution is to have something like the following in your loop:
--
local $| = 1; # print things right away (no buffering)
# $ltime and $ctime should be defined before the loop
$ctime = time();
if ($ltime ne $ctime) {
    print ". ";
    $ltime = $ctime;
}
--
Then the browser does not time out, because it continually gets information:
the script simply prints another period to the browser every second that it
is working. This also helps you know it's still working.

Regards,
David







RE: Why Premature end of script headers?

2002-07-15 Thread Bob Showalter

 -Original Message-
 From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]]
 Sent: Monday, July 15, 2002 12:23 PM
 To: 'Octavian Rasnita'; [EMAIL PROTECTED]
 Subject: Re: Why Premature end of script headers?
 
 
 Then the browser does not time out, because it continually gets information:
 the script simply prints another period to the browser every second that it
 is working.

True. But there are two possible downsides to consider:

1. It ties up a server process for an extended period.
2. The server will kill the CGI script if the client goes away
or the connection is lost.





Re: Why Premature end of script headers?

2002-07-15 Thread perl-dvd

Yes, you're right.
It seems to me there has got to be a more efficient way to accomplish what is
being attempted. Right now, the solution makes a copy of the file and then
checks every line of the original against every line of the copy; the cost of
that grows roughly with the square of the number of lines in the file.
Perhaps a better approach would be to loop through the lines, give each one a
line number, and write them to a copy. Then sort the copy alphanumerically and
loop through it, checking each line against the next: if the next line is a
duplicate, throw it away, and keep comparing against the following lines until
they are no longer the same. As soon as they differ, take the line that is
different and start checking it against the next line, and so on. Because
identical lines end up right next to each other, you don't have to check each
line against every other line, just against its neighbours. After you're done
eliminating the duplicates, sort the file again by line number (only if it is
necessary to keep the lines in their original order), and you're done.
Granted, this method requires an efficient way of sorting the lines of a large
text file, but if somebody knows of something that can do this, voilà.
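
A rough sketch of that idea in Perl (untested; it sorts the whole file in
memory, which a 7 MB file allows, and breaks ties by line number so the first
occurrence of each line is the one that survives):

#!/perl/bin/perl -w
use strict;

my $file = "f:/teddy/data/er.txt";
my $out  = "f:/teddy/data/er_new_good.txt";

open(FILE, "<$file") or die "Cannot open $file: $!";
my @lines = <FILE>;
close FILE;

# Tag every line with its original line number, then sort by content so
# that identical lines end up next to each other.
my $n = 0;
my @tagged = sort { $a->[1] cmp $b->[1] || $a->[0] <=> $b->[0] }
             map  { [ $n++, $_ ] } @lines;

# Walk the sorted list and drop a line if it is the same as the previous
# one (adjacent duplicates).
my (@kept, $prev);
for my $rec (@tagged) {
    push @kept, $rec unless defined $prev && $rec->[1] eq $prev;
    $prev = $rec->[1];
}

# Restore the original order by sorting on the saved line numbers.
open(OUT, ">$out") or die "Cannot open $out: $!";
print OUT map { $_->[1] } sort { $a->[0] <=> $b->[0] } @kept;
close OUT;

Of course, once the whole file is in memory anyway, a plain "seen" hash that
keeps only the first occurrence of each line would avoid the sorting
altogether.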

Regards,
David

