Re: Check disc quota
Mike (mickalo) Blezien wrote: I am new to this list, so if this is not the proper list to send this to, I would appreciate the name of the appropriate list. It's not the correct list; this is a beginners-cgi list. I would suggest the beginners mailing list. Point your browser to learn.perl.org. -- To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED]
graphic on-the-fly
Hey folks, I would like to have a script which produces graphics on the fly, but there are some problems: - When I write the graphic file to /tmp/ I can't open it by writing <img src="/tmp/pic.png"> in the page the script produces. But the picture is verifiably there. - So I tried an alternative: pictures in /var/www/ I can open in the page the script produces by writing <img src="http://localhost/pic.png">. But here the problem is that the script can't write the file to /var/www/ and gets a "Permission denied", even though I give everyone the right to write to that folder (actually I don't like this method because of the security hole). Any solutions? Konrad
Benchmarking
Good Kind Perl Gurus, I see mention of benchmarking CGI scripts to see how quickly they run. What's the best way to do this? I'm in a hosted Unix IRIX environment, so I may not have access to the shell and other areas. Camilo Gonzalez Web Developer Taylor Johnson Associates [EMAIL PROTECTED] http://www.taylorjohnson.com/
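One option that needs no shell access is timing the code from inside the script itself with the core Benchmark module. A minimal sketch (the loop body is a placeholder workload, not anything from the original post):

```perl
#!/usr/local/bin/perl
# Time a piece of code from within the script using the core Benchmark
# module -- no shell access required. The sub below is a stand-in
# workload; replace it with the code the CGI actually runs.
use strict;
use warnings;
use Benchmark qw(timeit timestr);

my $t = timeit(10_000, sub {
    my $total = 0;
    $total += $_ for 1 .. 100;    # placeholder work
});
print "10000 iterations took: ", timestr($t), "\n";
```

Since the report is printed by the script, it shows up in the CGI output (or the error log) even on a host with no shell.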
Re: graphic on-the-fly
on Mon, 15 Jul 2002 13:03:45 GMT, Konrad Foerstner wrote: I would like to have a script which produces graphics on the fly, but there are some problems: See Randal Schwartz's Web Techniques Column 60, Embedding a dynamic image in CGI output, at http://www.stonehenge.com/merlyn/WebTechniques/col60.html -- felix
Re: graphic on-the-fly
On Mon, 15 Jul 2002 15:03:45 +0200, [EMAIL PROTECTED] (Konrad Foerstner) wrote: Hey folks, I would like to have a script which produces graphics on the fly, but there are some problems: - When I write the graphic file to /tmp/ I can't open it by writing <img src="/tmp/pic.png"> in the page the script produces. But the picture is verifiably there. - So I tried an alternative: pictures in /var/www/ I can open in the page the script produces by writing <img src="http://localhost/pic.png">. But here the problem is that the script can't write the file to /var/www/ and gets a "Permission denied", even though I give everyone the right to write to that folder (actually I don't like this method because of the security hole). Any solutions? It depends on how you use the image. If it is just generated once, for that page, you can output the image directly from the CGI by printing it. The GD and ImageMagick modules have methods for printing directly to the browser. Or you can open the picture in your CGI program and just print it in binary mode to the browser. If you need to keep the picture around for a while, you will need to figure out the permissions. It is possible to make a 777 directory under your main HTTP directory, just for images, and you should be able to serve the picture from there. Otherwise, post more details of exactly what you need to do with the picture.
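A minimal sketch of the "print it in binary mode" approach from the reply above: the CGI opens an existing image file and builds the header plus raw bytes as one response string. The helper name and the /tmp path are illustrative, not from the thread.

```perl
#!/usr/local/bin/perl
# Serve an image straight from a CGI script: slurp the file in binary
# mode and return it behind an image/png header, so the browser never
# needs a URL under the document root.
use strict;
use warnings;

sub image_response {
    my ($path) = @_;
    open(my $fh, "<", $path) or die "Cannot open $path: $!";
    binmode $fh;                  # image data is binary
    local $/;                     # slurp mode: read the whole file
    my $bytes = <$fh>;
    close $fh;
    return "Content-type: image/png\n\n" . $bytes;
}

# In the CGI script itself:
#   binmode STDOUT;
#   print image_response("/tmp/pic.png");
```

The page that wants the picture then points its img src at this script rather than at a file path, which sidesteps the /tmp and /var/www permission problems entirely.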
Re: Counting the time with fractions of a second. The solution!
Teddy, Because times is a Perl function, not necessarily a var. Here's what one of my Perl books has to say about it: --- The times function returns the amount of job time consumed by this program and any child processes of this program. The syntax for the times function is @timelist = times; As you can see, times accepts no arguments. It returns a list consisting of the following four floating-point numbers: - the user time consumed by this program - the system time consumed by this program - the user time consumed by the child processes, if they exist - the system time consumed by the child processes, if they exist --- So there you have it. When you call $begin = (times)[0]; you are calling the times function and specifying that you only want the first element of the returned list, which happens to be the user time consumed. Regards, David - Original Message - From: Octavian Rasnita [EMAIL PROTECTED] To: [EMAIL PROTECTED] Sent: Saturday, July 13, 2002 11:27 AM Subject: Counting the time with fractions of a second. The solution! Hi all, I found the easiest solution (until now) for calculating how much time a script runs. Thank you for the other solutions, but they are too complicated. Here is the code you should use for calculating the time a script runs: my $start = (times)[0]; # Here goes the script. my $end = (times)[0]; my $duration = $end - $start; print "The script ran for $duration seconds"; This will print the duration in fractions of a second, like 1.012, etc. This method doesn't require any module. I used a simpler method a few months ago, but I don't remember it. I don't understand why (times)[0], but it works. I've tried using $times[0] and if I use it only for finding the start time it works, but if I use it for both the start and the end time, it doesn't work. Can you shed some light? Thank you very much.
Teddy Center: http://teddy.fcc.ro/ Mail: [EMAIL PROTECTED]
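The list slice David describes can be demonstrated directly; this small example (mine, not from the thread) unpacks all four values and shows that (times)[0] is the same quantity as the first one:

```perl
#!/usr/local/bin/perl
# times returns four floating-point numbers; (times)[0] is a list
# slice that keeps only the first of them, the user CPU time of
# this process.
use strict;
use warnings;

my ($user, $system, $cuser, $csystem) = times;
my $first = (times)[0];    # same quantity as $user, via a slice

printf "user=%.3f system=%.3f child-user=%.3f child-system=%.3f\n",
       $user, $system, $cuser, $csystem;
```

Note that this is CPU time consumed, not wall-clock time, so a script that mostly waits on I/O can report a much smaller number than the seconds that actually pass.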
Why Premature end of script headers?
Hi all, I've made a little script that takes the lines from a file, removes the duplicate lines and prints the result to a new file. I read the original file line by line because it is a big file (over 7 MB). The problem is that after printing almost 10% of the original file into the new file, the script dies and the only error I get in the log file is: [Mon Jul 15 14:44:48 2002] [error] [client 127.0.0.1] Premature end of script headers: clean.pl I've checked the original file to see if there are some strange characters in that line, or in the next one, but I haven't found anything that might cause that problem. Please tell me why the script dies only after it runs for a few minutes, and after writing over 11000 lines, and not from the beginning, if there is a problem with the script. Can you find any problems with my script? Thank you very very much! Here is the script:

#!/perl/bin/perl -w
print "Content-type: text/html\n\n";
# The original file:
my $file = "f:/teddy/data/er.txt";
# The result:
my $out = "f:/teddy/data/er_new_good.txt";
# Create the result file (empty):
open (OUT, ">$out");
print OUT "";
close OUT;
# Open the original file:
open (FILE, "<$file");
line: while (<FILE>) {
    my $line = $_;
    # Open the result file:
    open (OUT, "<$out");
    while (<OUT>) {
        # Check if the line from the original file exists in the result file:
        if ($line eq $_) {
            # If the line exists, jump back and read the next line from the original file:
            next line;
        } else {
            # Read the next line from the result file:
            next;
        }
    # End while for the result file:
    }
    close OUT;
    # Open the result file and append the line that was not found:
    open (OUT, ">>$out");
    print OUT $line;
    close OUT;
# Close the while loop for the original file:
}

Teddy Center: http://teddy.fcc.ro/ Mail: [EMAIL PROTECTED]
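Aside from the timeout issue discussed in the replies below, the script above re-reads the entire result file for every input line, which is what makes it so slow on a 7 MB file. A hash-based alternative (my suggestion, not code from the thread; the sub name is made up) reads each line exactly once:

```perl
#!/perl/bin/perl
# Hash-based de-duplication: remember every line seen in %seen, so the
# input is read once and the output file is never re-scanned.
use strict;
use warnings;

sub dedupe_file {
    my ($src, $dst) = @_;
    open(my $in,  "<", $src) or die "Cannot read $src: $!";
    open(my $out, ">", $dst) or die "Cannot write $dst: $!";
    my %seen;
    while (my $line = <$in>) {
        print $out $line unless $seen{$line}++;   # keep first occurrence
    }
    close $in;
    close $out;
}

# With the paths from the post this would be called as:
#   dedupe_file("f:/teddy/data/er.txt", "f:/teddy/data/er_new_good.txt");
```

The trade-off is memory: %seen holds one key per distinct line, which is fine for a 7 MB file but worth keeping in mind for much larger ones.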
RE: Why Premature end of script headers?
-Original Message- From: Octavian Rasnita [mailto:[EMAIL PROTECTED]] Sent: Monday, July 15, 2002 8:18 AM To: [EMAIL PROTECTED] Subject: Why Premature end of script headers? Hi all, I've made a little script that takes the lines from a file, removes the duplicate lines and prints the result to a new file. I read the original file line by line because it is a big file (over 7 MB). The problem is that after printing almost 10% of the original file into the new file, the script dies and the only error I get in the log file is: [Mon Jul 15 14:44:48 2002] [error] [client 127.0.0.1] Premature end of script headers: clean.pl Please tell me why the script dies only after it runs for a few minutes, and after writing over 11000 lines, and not from the beginning, if there is a problem with the script. Can you find any problems with my script? Probably your web server is timing out the request and killing your script. Web servers don't like to run long-running processes like this. Perhaps you can fork off a child and have the child take care of it. See http://www.stonehenge.com/merlyn/WebTechniques/col20.html for a nice technique on this.
Re: Why Premature end of script headers?
Hi, On Mon, 15 Jul 2002, Octavian Rasnita wrote: I've made a little script that takes the lines from a file, removes the duplicate lines and prints the result to a new file. The problem is that after printing almost 10% of the original file into the new file, the script dies and the only error I get in the log file is: [Mon Jul 15 14:44:48 2002] [error] [client 127.0.0.1] Premature end of script headers: clean.pl #!/perl/bin/perl -w print "Content-type: text/html\n\n"; I'm not sure why you are doing this. You never output anything to the web page aside from the header. Did you want this to be a web application? If so, you know the web server needs the correct permissions to manipulate the files. You would catch permission problems if you checked the error return $! from your open statements. That's always a wise thing to do. As for editing files, take a look at Tie::File. -lisa
Re: Why Premature end of script headers?
Probably your web server is timing out the request and killing your script. Web servers don't like to run long-running processes like this. Perhaps you can fork off a child and have the child take care of it. Another solution is to have something like the following in your loop: -- local $| = 1; # print things right away (no buffering) # $ltime and $ctime should be defined before the loop $ctime = time(); if ($ltime ne $ctime) { print "."; $ltime = $ctime; } -- Then the browser does not time out, because it continually gets information: the script simply prints another period to the browser every second that it is working. This can also help you know it's still working. Regards, David - Original Message - From: Bob Showalter [EMAIL PROTECTED] To: 'Octavian Rasnita' [EMAIL PROTECTED]; [EMAIL PROTECTED] Sent: Monday, July 15, 2002 9:28 AM Subject: RE: Why Premature end of script headers?
Web servers don't like to run long-running processes like this. Perhaps you can fork off a child and have the child take care of it. See http://www.stonehenge.com/merlyn/WebTechniques/col20.html for a nice technique on this.
RE: Why Premature end of script headers?
-Original Message- From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] Sent: Monday, July 15, 2002 12:23 PM To: 'Octavian Rasnita'; [EMAIL PROTECTED] Subject: Re: Why Premature end of script headers? Another solution is to have something like the following in your loop: -- local $| = 1; # print things right away (no buffering) # $ltime and $ctime should be defined before the loop $ctime = time(); if ($ltime ne $ctime) { print "."; $ltime = $ctime; } -- Then the browser does not time out, because it continually gets information. This can also help you know it's still working. True. But there are two possible downsides to consider: 1. It ties up a server process for an extended period. 2. The server will kill the CGI script if the client goes away or the connection is lost.
Re: Why Premature end of script headers?
Yes, you're right. It seems to me there has got to be a more efficient way to accomplish what is being attempted. Right now, the solution is making a copy of the file, then checking every line of the original against every line of the copy. This method's cost grows roughly quadratically with the number of lines in the file. Perhaps a better approach would be to loop through the lines, give them a line number, and write them to a copy. Then sort the copy alphanumerically, then loop through the copy checking one line against the next; if the next is a duplicate, throw it away, and continue checking it against the next until they are not the same. As soon as they're not the same, take the line that is not the same and begin checking it against the next line. And so on. Because all identical lines will be right next to each other, you don't have to check each line against every other line, just the ones next to it. After you're done eliminating the duplicates, sort the file again by line number (only if it's necessary to keep them in a certain order), and you're done. Granted, this method requires that you find some efficient manner of sorting the lines in large text files, but if somebody knows of something that can do this, voila. Regards, David - Original Message - From: Bob Showalter [EMAIL PROTECTED] To: [EMAIL PROTECTED]; [EMAIL PROTECTED] Sent: Monday, July 15, 2002 12:00 PM Subject: RE: Why Premature end of script headers?
Another solution is to have something like the following in your loop: -- local $| = 1; # print things right away (no buffering) # $ltime and $ctime should be defined before the loop $ctime = time(); if ($ltime ne $ctime) { print "."; $ltime = $ctime; } -- Then the browser does not time out, because it continually gets information. This can also help you know it's still working. True. But there are two possible downsides to consider: 1. It ties up a server process for an extended period. 2. The server will kill the CGI script if the client goes away or the connection is lost.
Subroutines
Hi Everyone, I am writing a program in which I am connecting to an Oracle database. I would like to put the environment variables and the connection routine into a separate subroutine, so I don't have to keep re-copying the code. What's the best way to go about this? Thanks, Theresa Theresa M. Mullin Programmer/Analyst Administrative Computing Northern Essex Community College 100 Elliott Way Haverhill, MA 01830 (978) 556-3757 [EMAIL PROTECTED]
File Storage
Hello all. I just have a quick question. I am writing a small guestbook program to put into a web site. I want to put the information into a DBM file. The web site host says 'cgi-bin access'. I know Apache usually runs under its own user. So where am I to put the DBM files? Shall I create a folder in my home directory and grant write permissions to the Apache user on it? Thanks. Jess
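A writable folder owned (or writable) by the Apache user is a reasonable setup. For the DBM side, here is a minimal sketch using the core SDBM_File module; the file name and entry format are placeholders, and in the setup described the file would live in that writable folder.

```perl
#!/usr/local/bin/perl
# Store a guestbook entry in a DBM file via the core SDBM_File module.
# SDBM needs no extra installation; tie() makes the file look like an
# ordinary hash.
use strict;
use warnings;
use Fcntl;         # supplies the O_RDWR / O_CREAT open flags
use SDBM_File;

my $dbfile = "guestbook_demo";   # example path; must be writable by Apache
tie my %guestbook, 'SDBM_File', $dbfile, O_RDWR | O_CREAT, 0644
    or die "Cannot tie $dbfile: $!";

# Key by timestamp; the value format is just an example.
$guestbook{ time() } = "Jess|Hello from the guestbook!";
untie %guestbook;
```

Note that SDBM has a fairly small per-record size limit; for longer guestbook entries one of the other DBM backends (or a plain text file) may fit better.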
RE: Subroutines
Theresa, Paul DuBois in his book Perl and MySQL puts all connection schemes in a library. Would that work for you? -Original Message- From: Theresa Mullin [mailto:[EMAIL PROTECTED]] Sent: Monday, July 15, 2002 2:31 PM To: [EMAIL PROTECTED] Subject: Subroutines Hi Everyone, I am writing a program in which I am connecting to an Oracle database. I would like to put the environment variables and the connection routine into a separate subroutine, so I don't have to keep re-copying the code. What's the best way to go about this? Thanks, Theresa Theresa M. Mullin Programmer/Analyst Administrative Computing Northern Essex Community College 100 Elliott Way Haverhill, MA 01830 (978) 556-3757 [EMAIL PROTECTED]
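A sketch of that library idea for the Oracle case: put the environment setup and the DBI connect call in one small module that every script uses. The module name, ORACLE_HOME value, SID, and credentials below are all placeholders, not values from the thread.

```perl
# OraConnect.pm -- a minimal sketch of a shared connection library.
# Everything configurable here is a placeholder.
package OraConnect;

use strict;
use warnings;

sub connect_db {
    require DBI;                              # DBI loaded only when needed
    $ENV{ORACLE_HOME} = "/opt/oracle";        # example environment variable
    my $dbh = DBI->connect(
        "dbi:Oracle:MYSID",                   # placeholder SID
        "username", "password",               # placeholder credentials
        { RaiseError => 1, AutoCommit => 0 },
    );
    return $dbh;
}

1;

# Every script that needs Oracle then only does:
#   use OraConnect;
#   my $dbh = OraConnect::connect_db();
```

If the credentials ever change, only this one file needs editing, which is the whole point of pulling the routine out of the individual scripts.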
Displaying counter
Hello all, I know this probably isn't the appropriate list for my question, but I don't want to receive email on other lists too! So, if you know the answer, I'd appreciate your help... I have the following in my HTML output: <head> <meta http-equiv="refresh" content="600;URL=program.cgi"> </head> Question: Is there a way to display the counter? I want to have my page say You will be transferred to blah in NNN seconds... I want the message to actually show the active counter... 600 changes to 599, then 598, and so on... Thanks!
RE: Displaying counter
Dude, Make it easy on yourself: use an animated GIF. It's close enough. You don't really mean 600 seconds, do you? -Original Message- From: Jim Lundeen [mailto:[EMAIL PROTECTED]] Sent: Monday, July 15, 2002 2:58 PM To: begin begin Subject: Displaying counter Hello all, I have the following in my HTML output: <head> <meta http-equiv="refresh" content="600;URL=program.cgi"> </head> Question: Is there a way to display the counter? I want to have my page say You will be transferred to blah in NNN seconds... I want the message to actually show the active counter... 600 changes to 599, then 598, and so on... Thanks!
Re: Displaying counter
Jim Lundeen wrote: I have the following in my HTML output: <head> <meta http-equiv="refresh" content="600;URL=program.cgi"> </head> Question: Is there a way to display the counter? I want to have my page say You will be transferred to blah in NNN seconds... I want the message to actually show the active counter... 600 changes to 599, then 598, and so on... There's a script available from http://javascript.internet.com that does this very thing. I've used it and it works very well.
cgi put script?
Hi, I'm trying to get the PUT function from Netscape to work with Apache. I have a Windows OS (NT Server), and I'm using the Windows version of Perl as a CGI script interpreter. The problem is, the PUT script won't work with Perl. I just get server error messages, because Perl doesn't like the commands, or the syntax, in the scripts. I found the scripts on a web site and in a book on Apache. I'd like to first try just finding another PUT script that will work with Perl on Windows. Are there any sources for CGI scripts I might look at? Failing that, can someone point me to some resources to show me how to write my own PUT scripts? Thanks, Steve
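For writing one from scratch: under CGI, a PUT request arrives on STDIN with its size in CONTENT_LENGTH, so a handler mostly just reads that many bytes and saves them. The sketch below is my own illustration, not the scripts from the web site or the book; the sub name and defaults are made up, and a real handler must also authenticate the client before accepting uploads.

```perl
#!/usr/bin/perl
# Minimal sketch of a CGI PUT handler: read CONTENT_LENGTH bytes from
# STDIN and store them under a single fixed directory. Illustrative
# only -- there is no authentication or error page handling here.
use strict;
use warnings;
use File::Basename qw(basename);

sub handle_put {
    my ($upload_dir) = @_;
    my $length = $ENV{CONTENT_LENGTH} || 0;
    # basename() strips any directory parts, so a client-supplied path
    # cannot escape $upload_dir.
    my $name = basename($ENV{PATH_INFO} || "upload.dat");

    binmode STDIN;                     # the body may be binary
    read(STDIN, my $body, $length);

    open(my $fh, ">", "$upload_dir/$name") or die "Cannot save: $!";
    binmode $fh;
    print $fh $body;
    close $fh;

    print "Content-type: text/plain\n\n";
    print "Stored $length bytes as $name\n";
}
```

The binmode calls matter particularly on Windows, where text-mode I/O would otherwise mangle any binary upload.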