Running long script in the background

2007-02-06 Thread [EMAIL PROTECTED]
Hello,

I am trying to write a python cgi that calls a script over ssh, the
problem is the script takes a very long time to execute so Apache
makes the CGI time out and I never see any output.  The script is set
to print a progress report to stdout every 3 seconds but I never see
any output until the child process is killed.

Here's what I have in my python script:

command = "ssh -l root %s /scripts/xen/xen-create-win-vps1.sh %s" %
(host, domuname)
output = os.popen(command)
for line in output:
    print line.strip()

Here's a copy of the bash script.

http://watters.ws/script.txt

I also tried using os.spawnv to run ssh in the background and nothing
happens.

Does anybody know a way to make output show in real time?

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Running long script in the background

2007-02-06 Thread jasonmc
> Does anybody know a way to make output show in real time?

You can put: #!/usr/bin/python -u
at the top of the script to have unbuffered binary stdout and stderr.
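In modern Python 3 terms, the same effect can be had by flushing stdout explicitly after each line (the function name here is mine, not from the original script):

```python
import sys
import time

def report_progress(steps, interval=0.0):
    """Print one progress line per step, flushing so each appears at once."""
    lines = []
    for i in range(steps):
        line = "progress: step %d" % (i + 1)
        lines.append(line)
        print(line)
        sys.stdout.flush()  # same effect as running under python -u
        time.sleep(interval)
    return lines
```

Without the flush, stdout is block-buffered when it is a pipe (as under CGI), so nothing appears until the buffer fills or the process exits.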



Re: Running long script in the background

2007-02-06 Thread Thomas Guettler
[EMAIL PROTECTED] wrote:

> Hello,
>
> I am trying to write a python cgi that calls a script over ssh, the
> problem is the script takes a very long time to execute so Apache
> makes the CGI time out and I never see any output.  The script is set
> to print a progress report to stdout every 3 seconds but I never see
> any output until the child process is killed.
>
> Here's what I have in my python script:
>
> command = "ssh -l root %s /scripts/xen/xen-create-win-vps1.sh %s" %
> (host, domuname)
> output = os.popen(command)
> for line in output:
>     print line.strip()

try sys.stdout.flush() after every print.

Or try something like this:
import sys, time

class FlushFile:
def __init__(self, fd):
self.fd = fd
def flush(self):
self.fd.flush()
def write(self, str):
self.fd.write(str)
self.fd.flush()

oldstdout = sys.stdout
sys.stdout = FlushFile(sys.stdout)

for i in range(5):
print "Hello",
time.sleep(0.5)
print 

-- 
Thomas Güttler, http://www.thomas-guettler.de/ http://www.tbz-pariv.de/
E-Mail: guettli (*) thomas-guettler + de
Spam Catcher: [EMAIL PROTECTED]



Re: Running long script in the background

2007-02-06 Thread [EMAIL PROTECTED]
On Feb 6, 8:36 am, "jasonmc" <[EMAIL PROTECTED]> wrote:
> > Does anybody know a way to make output show in real time?
>
> You can put: #!/usr/bin/python -u
> at the top of the script to have unbuffered binary stdout and stderr.


Thanks.  I tried that but it still times out waiting for output.

Everything works fine until I call the popen function, then it
freezes.  What I want is to print the output in real time, just like
it does when I run it from a shell.



Re: Running long script in the background

2007-02-06 Thread [EMAIL PROTECTED]
On Feb 6, 10:37 am, "[EMAIL PROTECTED]" <[EMAIL PROTECTED]> wrote:
> On Feb 6, 8:36 am, "jasonmc" <[EMAIL PROTECTED]> wrote:
>
> > > Does anybody know a way to make output show in real time?
>
> > You can put: #!/usr/bin/python -u
> > at the top of the script to have unbuffered binary stdout and stderr.
>
> Thanks.  I tried that but it still times out waiting for output.
>
> Everything works fine until I call the popen function, then it
> freezes.  What I want is to print the output in real time, just like
> it does when I run it from a shell.


I tried flushing stdout and the same thing happens.  As soon as the
os.popen(command) line runs it stops there, the next print statement
never even runs.

I've also tried using os.spawnv to make the process run in the
background but then the ssh command never runs.



Re: Running long script in the background

2007-02-06 Thread Erik Max Francis
[EMAIL PROTECTED] wrote:

> I tried flushing stdout and the same thing happens.  As soon as the
> os.popen(command) line runs it stops there, the next print statement
> never even runs.
> 
> I've also tried using os.spawnv to make the process run in the
> background but then the ssh command never runs.

Based on what you describe, this isn't a good application for a 
single-transaction CGI exchange.  The timeouts are not happening at the 
level of your CGI script, but rather either at the HTTP server itself or 
at the remote client.  In either case, fixing it as a one-transaction, 
one-script solution is not going to be very feasible.

A more sensible way to do it is to have one logical page (which could be 
the same physical page if you want) which accepts job requests, spawns 
them off in the background, and offers a link to a second logical page 
which sees if the job has completed -- showing the results if it has -- 
or refreshes periodically if it hasn't yet.
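A rough sketch of that two-page pattern (directory layout, names, and the shell wrapper are all hypothetical, not from the original post): the submit step starts the job detached with its output redirected to a per-job log file, and the poll step just reads that file back.

```python
import os
import subprocess

STATUS_DIR = "/tmp/jobs"  # hypothetical location for per-job files

def submit_job(job_id, command):
    """Start command in the background; its output goes to a per-job log."""
    os.makedirs(STATUS_DIR, exist_ok=True)
    log_path = os.path.join(STATUS_DIR, job_id + ".log")
    done_path = os.path.join(STATUS_DIR, job_id + ".done")
    # The backgrounded subshell marks completion by touching the .done file,
    # so the wrapping shell (and this CGI request) returns immediately.
    shell_cmd = "(%s; touch %s) >%s 2>&1 &" % (command, done_path, log_path)
    subprocess.Popen(shell_cmd, shell=True).wait()
    return log_path

def job_status(job_id):
    """Return (finished, output_so_far) for the status page to render."""
    log_path = os.path.join(STATUS_DIR, job_id + ".log")
    done = os.path.exists(os.path.join(STATUS_DIR, job_id + ".done"))
    output = open(log_path).read() if os.path.exists(log_path) else ""
    return done, output
```

The status page can then show `output` and auto-refresh until `done` is true, without any single request outliving the server's timeout.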

-- 
Erik Max Francis && [EMAIL PROTECTED] && http://www.alcyone.com/max/
  San Jose, CA, USA && 37 20 N 121 53 W && AIM, Y!M erikmaxfrancis
   You could have another fate / You could be in another place
-- Anggun


Re: Running long script in the background

2007-02-06 Thread [EMAIL PROTECTED]
On Feb 6, 2:02 pm, Dennis Lee Bieber <[EMAIL PROTECTED]> wrote:
> On 6 Feb 2007 07:37:33 -0800, "[EMAIL PROTECTED]"
> <[EMAIL PROTECTED]> declaimed the following in comp.lang.python:
>
>
>
> > Everything works fine until I call the popen function, then it
> > freezes.  What I want is to print the output in real time, just like
> > it does when I run it from a shell.
>
> And you want /this/ in a web page?
>
> I don't think HTTP is designed for that... As I understand it, it
> expects to get a complete page back and then the transaction is complete
> and forgotten (except for the presence of session cookies). To report
> dynamically on a web page tends to either be something like a
> timed-redirect (reload) of the same URL with the cookie, and that is a
> completely separate transaction starting a new CGI (or equivalent)
> process. AJAX techniques may clean up some of this -- by not really
> reloading the whole page, instead updating the DOM based upon data
> transferred.


Web pages can show output as it's sent.  For testing I created a
script on the server that untars a 600 meg volume, and I can see each
file name show up in my browser instantly, just like it should.  The
other script I'm trying to run won't show anything until the entire
process is complete, even though it's just a bunch of echo statements
in a for loop; I'm not sure why they behave differently.





Re: Running long script in the background

2007-02-06 Thread Erik Max Francis
[EMAIL PROTECTED] wrote:

> Web pages can show output as it's sent.  For testing I created a
> script on the server that untars a 600 meg volume, I can see each file
> name show up in my browser instantly, just like it should.  The other
> script I'm trying to run won't show anything until the entire process
> is complete and it's just a bunch of echo statements in a for loop,
> I'm not sure why they behave differently.

In a word:  buffering.



Re: Running long script in the background

2007-02-06 Thread Gabriel Genellina
On Tue, 06 Feb 2007 16:44:52 -0300, [EMAIL PROTECTED]  
<[EMAIL PROTECTED]> wrote:

> On Feb 6, 2:02 pm, Dennis Lee Bieber <[EMAIL PROTECTED]> wrote:
>> On 6 Feb 2007 07:37:33 -0800, "[EMAIL PROTECTED]"
>> <[EMAIL PROTECTED]> declaimed the following in comp.lang.python:
>>
>> > Everything works fine until I call the popen function, then it
>> > freezes.  What I want is to print the output in real time, just like
>> > it does when I run it from a shell.
>>
>> And you want /this/ in a web page?
>>
>> I don't think HTTP is designed for that... As I understand it,  
>> it
>> expects to get a complete page back and then the transaction is complete
>> and forgotten (except for the presence of session cookies). To report

If the response does not include a Content-Length header and has a  
Transfer-Encoding: chunked header, then it is sent in chunks (blocks) and  
the client is able to process it piece by piece.
See the server docs on how to enable and generate a chunked response. On  
Zope 2, for example, it's enough to use response.write().
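As a sketch of the idea in WSGI terms (the app is hypothetical, and whether the body is actually sent chunked depends on the server): returning a generator and omitting Content-Length lets a streaming-capable server send each piece as it is produced.

```python
import time

def app(environ, start_response):
    # No Content-Length header, so an HTTP/1.1 server is free to send
    # the body with Transfer-Encoding: chunked, one piece per yield.
    start_response("200 OK", [("Content-Type", "text/plain")])

    def body():
        for i in range(3):
            yield ("progress: step %d\n" % (i + 1)).encode("utf-8")
            time.sleep(0.1)  # stand-in for real work between updates

    return body()
```

Note that some front ends (proxies, or mod_deflate-style filters) may still buffer the whole response regardless of what the application does.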

> Web pages can show output as it's sent.  For testing I created a
> script on the server that untars a 600 meg volume, I can see each file
> name show up in my browser instantly, just like it should.  The other
> script I'm trying to run won't show anything until the entire process
> is complete and it's just a bunch of echo statements in a for loop,
> I'm not sure why they behave differently.

Are you sure the other process is executing, that its output is not  
buffered, and that you're reading it line by line?

-- 
Gabriel Genellina



Re: Running long script in the background

2007-02-06 Thread [EMAIL PROTECTED]
On Feb 6, 5:26 am, "[EMAIL PROTECTED]" <[EMAIL PROTECTED]> wrote:
> Hello,
>
> I am trying to write a python cgi that calls a script over ssh, the
> problem is the script takes a very long time to execute so Apache
> makes the CGI time out and I never see any output.  The script is set
> to print a progress report to stdout every 3 seconds but I never see
> any output until the child process is killed.
>

>
> Does anybody know a way to make output show in real time?

Try this:



# test.py
import os
import sys
import time

def command():
for x in range(5):
print x
sys.stdout.flush()
time.sleep(1)

def main():
command = 'python -c "import test; test.command()"'
print 'running: %s' % command
output = os.popen(command, 'r', 1)
while True:
line = output.readline()
if line == '':
break
sys.stdout.write(line)
sys.stdout.flush()

if __name__ == '__main__':
main()



The problem is with using the file-like object returned by popen as an
iterator. It will block until the child process is killed, so just
iterate across it manually.

Pete



Re: Running long script in the background

2007-02-06 Thread [EMAIL PROTECTED]
On Feb 6, 11:13 pm, "[EMAIL PROTECTED]" <[EMAIL PROTECTED]>
wrote:

> output = os.popen(command, 'r', 1)

OOPS... I imagine the ridiculous buffer size is unnecessary... I was
trying to get it to work with the original for loop iterating on
output, it should work fine without it.

Pete



Re: Running long script in the background

2007-02-07 Thread Jordan
On Feb 6, 8:26 am, "[EMAIL PROTECTED]" <[EMAIL PROTECTED]> wrote:
> Hello,
>
> I am trying to write a python cgi that calls a script over ssh, the
> problem is the script takes a very long time to execute so Apache
> makes the CGI time out and I never see any output.  The script is set
> to print a progress report to stdout every 3 seconds but I never see
> any output until the child process is killed.
>
> Here's what I have in my python script:
>
> command = "ssh -l root %s /scripts/xen/xen-create-win-vps1.sh %s" %
> (host, domuname)
> output = os.popen(command)
> for line in output:
>     print line.strip()
>
> Here's a copy of the bash script.
>
> http://watters.ws/script.txt
>
> I also tried using os.spawnv to run ssh in the background and nothing
> happens.
>
> Does anybody know a way to make output show in real time?

Just a little note: os.popen has been replaced by the subprocess
module.  ;D
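For reference, a line-by-line read with subprocess might look like this in modern Python 3 (the command here is just a stand-in for the real ssh invocation):

```python
import subprocess
import sys

def stream_command(args):
    """Run a command and echo its stdout line by line as it arrives."""
    proc = subprocess.Popen(args, stdout=subprocess.PIPE,
                            bufsize=1, text=True)
    collected = []
    for line in proc.stdout:  # yields each line as the child produces it
        collected.append(line.rstrip("\n"))
        print(line, end="")
        sys.stdout.flush()
    proc.wait()
    return collected
```

Unlike the old Python 2 popen file iterator, iterating `proc.stdout` here does not wait for the child to exit before yielding lines.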



Re: Running long script in the background

2007-02-07 Thread Karthik Gurusamy
On Feb 6, 5:26 am, "[EMAIL PROTECTED]" <[EMAIL PROTECTED]> wrote:
> Hello,
>
> I am trying to write a python cgi that calls a script over ssh, the
> problem is the script takes a very long time to execute so Apache
> makes the CGI time out and I never see any output.  The script is set
> to print a progress report to stdout every 3 seconds but I never see
> any output until the child process is killed.
>
> Here's what I have in my python script:
>
> command = "ssh -l root %s /scripts/xen/xen-create-win-vps1.sh %s" %
> (host, domuname)
> output = os.popen(command)

Apart from other buffering issues, it could very well be that ssh
returns all the output in one single big chunk. Try running the ssh
command (with the trailing command) from your shell and see if it
generates output immediately.

There may be some option to make ssh not buffer the data it reads from
the remote command execution. If there is no such option, you are most
likely out of luck: even if you make your remote script unbuffered,
ssh may still be buffering its output.

If both machines share a filesystem, you can do a trick. Make your
script write its output unbuffered to a file; since the file is
mounted and available on both machines, your main Python script can
read from it while the job runs (note that you may need a thread to do
this, as your script will otherwise be stuck waiting for the ssh to
complete).
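A sketch of that shared-file trick (the file path and worker callable are placeholders): one thread tails the log file and echoes new lines as they appear, while the main thread blocks on the remote command.

```python
import threading
import time

def run_with_tail(path, worker, poll=0.1):
    """Run worker() while echoing lines appended to path; return the lines."""
    seen = []
    stop = threading.Event()

    def tail():
        with open(path) as f:
            while True:
                line = f.readline()
                if line:
                    seen.append(line.rstrip("\n"))
                    print(line, end="")
                elif stop.is_set():
                    return  # worker finished and the file is drained
                else:
                    time.sleep(poll)  # no new data yet; poll again

    t = threading.Thread(target=tail)
    t.start()
    worker()  # the blocking call, e.g. reading the ssh pipe to completion
    stop.set()
    t.join()
    return seen
```

Because the tail thread only returns once `readline()` hits end-of-file after the stop event is set, it drains any lines written just before the worker finished.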

Karthik

> for line in output:
>     print line.strip()
>
> Here's a copy of the bash script.
>
> http://watters.ws/script.txt
>
> I also tried using os.spawnv to run ssh in the background and nothing
> happens.
>
> Does anybody know a way to make output show in real time?




Re: Running long script in the background

2007-02-07 Thread [EMAIL PROTECTED]
On Feb 8, 10:42 am, "Karthik Gurusamy" <[EMAIL PROTECTED]> wrote:
> On Feb 6, 5:26 am, "[EMAIL PROTECTED]" <[EMAIL PROTECTED]> wrote:
>
> > Hello,
>
> > I am trying to write a python cgi that calls a script over ssh, the
> > problem is the script takes a very long time to execute so Apache
> > makes the CGI time out and I never see any output.  The script is set
> > to print a progress report to stdout every 3 seconds but I never see
> > any output until the child process is killed.
>
> > Here's what I have in my python script:
>
> > command = "ssh -l root %s /scripts/xen/xen-create-win-vps1.sh %s" %
> > (host, domuname)
> > output = os.popen(command)
>
> Apart from other buffering issues, it could very well be that ssh
> returns all the output in one single big chunk. Try running the ssh
> command (with the trailing command) from your shell and see if it
> generates output immediately.
>
> There may be some option to make ssh not buffer the data it reads from
> the remote command execution. If there is no such option, you are most
> likely out of luck: even if you make your remote script unbuffered,
> ssh may still be buffering its output.
>
> If both machines share a filesystem, you can do a trick. Make your
> script write its output unbuffered to a file; since the file is
> mounted and available on both machines, your main Python script can
> read from it while the job runs (note that you may need a thread to do
> this, as your script will otherwise be stuck waiting for the ssh to
> complete).
>
> Karthik
>
> > for line in output:
> >     print line.strip()
>
> > Here's a copy of the bash script.
>
> >http://watters.ws/script.txt
>
> > I also tried using os.spawnv to run ssh in the background and nothing
> > happens.
>
> > Does anybody know a way to make output show in real time?


You could also try flushing the buffer after each status message
