Re: multiprocessing and accessing server's stdout

2010-06-02 Thread Bryan
I wrote:
> So what you really need is to capture the output of a command, in this
> case LaTeX, so you can copy it back to the client. You can do that
> with the subprocess module in the Python standard library.
>
> If the command generated so much output so fast that you felt the need
> to avoid the extra copy, I suppose you could fork() then hook stdout
> directly to a socket connected to the client with dup2(), then exec()
> the command. But no need for that just to capture LaTeX's output.

Upon further reading, I see that the subprocess module makes the
direct-hookup method easy, at least on 'nix systems. Just tell
subprocess.Popen to use the client-connected socket as the
subprocess's stdout.

The question here turns out to make more sense than I had thought upon
reading the first post. The server runs a command at the client's
request, and we want to deliver the output of that command back to the
client. A brilliantly efficient method is to direct the command's
stdout to the client's connection.

Below is a demo server that sends the host's words file to any client
that connects. It assumes Unix.


--Bryan Olson


#!/usr/bin/python

import socket
from thread import start_new_thread
from subprocess import Popen


def demo(sock):
    # Hand the client-connected socket to the subprocess as its stdout;
    # the command's output goes straight to the client, no copying.
    subp = Popen(['cat', '/usr/share/dict/words'], stdout=sock)
    subp.wait()
    sock.close()

if __name__ == '__main__':
    listener_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener_sock.bind(('', 54321))
    listener_sock.listen(5)
    while True:
        sock, remote_address = listener_sock.accept()
        start_new_thread(demo, (sock,))
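[Editor's note: a minimal client sketch for exercising a server like the one above; the helper name `fetch_all` is illustrative, not from the original post. It just connects and reads until the server closes the connection.]

```python
import socket

def fetch_all(host, port):
    # Connect to the demo server and read until EOF; the server
    # signals end-of-output by closing the connection.
    sock = socket.create_connection((host, port))
    chunks = []
    while True:
        data = sock.recv(4096)
        if not data:
            break
        chunks.append(data)
    sock.close()
    return b''.join(chunks)
```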
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: multiprocessing and accessing server's stdout

2010-06-02 Thread Bryan
Tim Arnold wrote:
> Hi, This is the setup I was asking about.
> I've got users using a python-written command line client. They're
> requesting services from a remote server that fires a LaTeX process. I
> want them to see the stdout from the LaTeX process.

So what you really need is to capture the output of a command, in this
case LaTeX, so you can copy it back to the client. You can do that
with the subprocess module in the Python standard library.

If the command generated so much output so fast that you felt the need
to avoid the extra copy, I suppose you could fork() then hook stdout
directly to a socket connected to the client with dup2(), then exec()
the command. But no need for that just to capture LaTeX's output.
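[Editor's note: a sketch of the plain capture-and-copy approach described above. The helper name is illustrative, and `echo` stands in for the real LaTeX invocation; stderr is merged into stdout so the client sees everything the tool printed.]

```python
import subprocess

def run_and_capture(cmd):
    # Run the command, capturing stdout (with stderr merged in)
    # so the server can copy it back to the client afterwards.
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT)
    output, _ = proc.communicate()
    return proc.returncode, output

rc, out = run_and_capture(['echo', 'sample latex output'])
```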


--
--Bryan Olson


Re: multiprocessing and accessing server's stdout

2010-06-01 Thread Tim Arnold
On May 28, 7:47 pm, "Martin P. Hellwig" wrote:
> On 05/28/10 21:44, Adam Tauno Williams wrote:
>
> > On Fri, 2010-05-28 at 15:41 +0100, Martin P. Hellwig wrote:
> >> On 05/28/10 13:17, Adam Tauno Williams wrote:
> >> 
> >>> You should be able to point it at any file-like object.  But, again,
> >>> why?
> >>> If you have the data in the process why send it to stdout and redirect
> >>> it.  Why not just send the data to the client directly?
> >> Well you might want to multiplex it to more than one client, not saying
> >> that this is the case here, just something I imagine possible.
>
> > That still doesn't make sense.  Why 'multiplex stdout'?  Why not just
> > multiplex the data into proper IPC channels in the first place?
>
> I am going on a stretch here, I mostly agree with you, just trying to
> illustrate that there could be corner cases where this is sensible.
> The current situation could be that there is a client/server program
> (binary only perhaps) which is not multi-user safe.
>
> Python can be used as a wrapper around the server to make it
> multi-client, by emulating the exact behavior towards the client, the
> client program does not have to be changed.
>
> --
> mph

Hi, This is the setup I was asking about.
I've got users using a python-written command line client. They're
requesting services from a remote server that fires a LaTeX process. I
want them to see the stdout from the LaTeX process.

I was using multiprocessing to handle the requests, but the stdout
shows up on the server's terminal window where I started the
server.serve_forever process.

I started using RPyC and now the stdout appears on the client terminal
making the request.

I was trying to minimize the number of packages I use, hoping I could
get the same capability from multiprocessing that I get with RPyC.

thanks for the comments. I'm still processing what's been written
here.
--Tim



Re: multiprocessing and accessing server's stdout

2010-05-28 Thread Martin P. Hellwig

On 05/28/10 21:44, Adam Tauno Williams wrote:

On Fri, 2010-05-28 at 15:41 +0100, Martin P. Hellwig wrote:

On 05/28/10 13:17, Adam Tauno Williams wrote:


You should be able to point it at any file-like object.  But, again,
why?
If you have the data in the process why send it to stdout and redirect
it.  Why not just send the data to the client directly?

Well you might want to multiplex it to more than one client, not saying
that this is the case here, just something I imagine possible.


That still doesn't make sense.  Why 'multiplex stdout'?  Why not just
multiplex the data into proper IPC channels in the first place?


I am going on a stretch here, I mostly agree with you, just trying to 
illustrate that there could be corner cases where this is sensible.
The current situation could be that there is a client/server program 
(binary only perhaps) which is not multi-user safe.


Python can be used as a wrapper around the server to make it 
multi-client, by emulating the exact behavior towards the client, the 
client program does not have to be changed.


--
mph


Re: multiprocessing and accessing server's stdout

2010-05-28 Thread Adam Tauno Williams
On Fri, 2010-05-28 at 15:41 +0100, Martin P. Hellwig wrote:
> On 05/28/10 13:17, Adam Tauno Williams wrote:
> 
> > You should be able to point it at any file-like object.  But, again,
> > why?
> > If you have the data in the process why send it to stdout and redirect
> > it.  Why not just send the data to the client directly?
> Well you might want to multiplex it to more than one client, not saying
> that this is the case here, just something I imagine possible.

That still doesn't make sense.  Why 'multiplex stdout'?  Why not just
multiplex the data into proper IPC channels in the first place?
-- 
Adam Tauno Williams  LPIC-1, Novell CLA

OpenGroupware, Cyrus IMAPd, Postfix, OpenLDAP, Samba



Re: multiprocessing and accessing server's stdout

2010-05-28 Thread Martin P. Hellwig

On 05/28/10 13:17, Adam Tauno Williams wrote:



You should be able to point it at any file-like object.  But, again,
why?

If you have the data in the process why send it to stdout and redirect
it.  Why not just send the data to the client directly?


Well you might want to multiplex it to more than one client, not saying
that this is the case here, just something I imagine possible.


--
mph


Re: multiprocessing and accessing server's stdout

2010-05-28 Thread Adam Tauno Williams
On Thu, 2010-05-27 at 08:36 -0700, Tim Arnold wrote:
> On May 26, 4:52 pm, Adam Tauno Williams wrote:
> > On Wed, 2010-05-26 at 11:47 -0700, Tim Arnold wrote:
> > > Hi,
> > > I'm using multiprocessing's BaseManager to create a server on one
> > > machine and a client on another. The client fires a request and the
> > > server does some work, the result of which ends up on a shared file
> > > system that both the client and server can see.
> > > However, I need the client machine to see the stdout of the process
> > > running on the server. Not sure this is doable--I've been unable to
> > > google anything useful on this one.
> >
> > Nope, it isn't.  Don't use stdout, use an IPC mechanism to communicate
> > between the client and the server if you need feedback.
> Thanks for that info, it saves me some time. This is a new area for me
> though: do you redirect stdout on the server to a socket and have the
> client listen and somehow pipe the socket's contents to the client
> stdout?

No, I close stdin, stderr, and stdout on the server processes and attach
them to /dev/null.  Just don't use stdout.

> Interestingly, the RPyC package manages it--that is, the client gets
> the stdout of the server process, so I'll dig into that code to get an
> idea. In the meantime, are there any recipes or other docs that would
> be helpful? I've been googling but without much luck.

Closing stdout and attaching it to any other file descriptor is pretty
simple.

sys.stdout = open('/dev/null', 'w')

You should be able to point it at any file-like object.  But, again,
why?

If you have the data in the process why send it to stdout and redirect
it.  Why not just send the data to the client directly?
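[Editor's note: a minimal sketch of the "send it to the client directly" idea above. The function and step names are illustrative; the point is that the job writes to an explicit stream the caller supplies -- a socket's makefile(), a pipe, or a buffer -- rather than implicitly printing to stdout.]

```python
import io

def do_work(out):
    # Report progress to whatever file-like object the caller
    # passes in, instead of printing to the process's stdout.
    for step in ('parsing', 'typesetting', 'done'):
        out.write(step + '\n')

buf = io.StringIO()   # stand-in for a client connection's file object
do_work(buf)
```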
-- 
Adam Tauno Williams  LPIC-1, Novell CLA

OpenGroupware, Cyrus IMAPd, Postfix, OpenLDAP, Samba



Re: multiprocessing and accessing server's stdout

2010-05-27 Thread Tim Arnold
On May 26, 4:52 pm, Adam Tauno Williams wrote:
> On Wed, 2010-05-26 at 11:47 -0700, Tim Arnold wrote:
> > Hi,
> > I'm using multiprocessing's BaseManager to create a server on one
> > machine and a client on another. The client fires a request and the
> > server does some work, the result of which ends up on a shared file
> > system that both the client and server can see.
> > However, I need the client machine to see the stdout of the process
> > running on the server. Not sure this is doable--I've been unable to
> > google anything useful on this one.
>
> Nope, it isn't.  Don't use stdout, use an IPC mechanism to communicate
> between the client and the server if you need feedback.
> --
> Adam Tauno Williams  LPIC-1, Novell CLA
> 
> OpenGroupware, Cyrus IMAPd, Postfix, OpenLDAP, Samba

Thanks for that info, it saves me some time. This is a new area for me
though: do you redirect stdout on the server to a socket and have the
client listen and somehow pipe the socket's contents to the client
stdout?

Interestingly, the RPyC package manages it--that is, the client gets
the stdout of the server process, so I'll dig into that code to get an
idea. In the meantime, are there any recipes or other docs that would
be helpful? I've been googling but without much luck.

thanks,
--Tim


Re: multiprocessing and accessing server's stdout

2010-05-26 Thread Adam Tauno Williams
On Wed, 2010-05-26 at 11:47 -0700, Tim Arnold wrote:
> Hi,
> I'm using multiprocessing's BaseManager to create a server on one
> machine and a client on another. The client fires a request and the
> server does some work, the result of which ends up on a shared file
> system that both the client and server can see.
> However, I need the client machine to see the stdout of the process
> running on the server. Not sure this is doable--I've been unable to
> google anything useful on this one.

Nope, it isn't.  Don't use stdout, use an IPC mechanism to communicate
between the client and the server if you need feedback.
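[Editor's note: a minimal local sketch of the IPC approach suggested above, using a multiprocessing.Queue to carry the job's output lines instead of stdout. The worker and line contents are illustrative; in Tim's setup the same queue could be registered with a BaseManager so the remote client can drain it over the network.]

```python
import multiprocessing

def worker(queue):
    # The server-side job pushes each output line through the
    # queue instead of printing; None marks end-of-output.
    for line in ('starting job', 'job finished'):
        queue.put(line)
    queue.put(None)

if __name__ == '__main__':
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=worker, args=(q,))
    p.start()
    for line in iter(q.get, None):   # client side: drain until sentinel
        print(line)
    p.join()
```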
-- 
Adam Tauno Williams  LPIC-1, Novell CLA

OpenGroupware, Cyrus IMAPd, Postfix, OpenLDAP, Samba



multiprocessing and accessing server's stdout

2010-05-26 Thread Tim Arnold
Hi,
I'm using multiprocessing's BaseManager to create a server on one
machine and a client on another. The client fires a request and the
server does some work, the result of which ends up on a shared file
system that both the client and server can see.

However, I need the client machine to see the stdout of the process
running on the server. Not sure this is doable--I've been unable to
google anything useful on this one.

thanks,
--Tim Arnold
