Re: output to console and to multiple files

2007-02-16 Thread [EMAIL PROTECTED]
On Feb 15, 5:48 pm, Gabriel Genellina [EMAIL PROTECTED] wrote:
On Thu, 15 Feb 2007 19:35:10 -0300, Matimus [EMAIL PROTECTED] wrote:



  I think you should be able to use my or goodwolf's solution with the
  subprocess module. Something like this (untested):

  [code]
  class TeeFile(object):
      def __init__(self, *files):
          self.files = files
      def write(self, txt):
          for fp in self.files:
              fp.write(txt)

  I tried this at lunch and it doesn't work. Some version of this method
  may work, but Popen tries to call the 'fileno' method of the TeeFile
  object (at least it did on my setup) and it isn't there. This is just
  a preemptive warning before someone comes back to let me know my code
  doesn't work.

 I don't think any Python-only solution could work. The pipe options
 available for subprocess are those of the underlying OS, and the OS knows
 nothing about Python file objects.

 --
 Gabriel Genellina

I've tried the subprocess method before without any luck.


Thanks for all your suggestions. I guess it's time to rethink what I
want to do.

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: output to console and to multiple files

2007-02-16 Thread Bart Ogryczak
On Feb 14, 11:28 pm, [EMAIL PROTECTED] [EMAIL PROTECTED]
wrote:
 Hello,

 I searched on Google and in this Google Group, but did not find any
 solution to my problem.

 I'm looking for a way to output stdout/stderr (from a subprocess or
 spawn) to screen and to at least two different files.

 e.g.
 stdout/stderr -> screen
 stdout -> log.out
 stderr -> log.err

 and if possible
 stdout/stderr -> screen and log.txt

 3 files from stdout/stderr

I'd derive a class from file, override its write() method to send a
copy to the log, and then assign sys.stdout = newFile(sys.stdout).
Same for stderr.
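A minimal sketch of that idea (the class name LoggedFile is illustrative, and the example uses present-day Python syntax rather than the 2.x of this thread):

```python
import sys

class LoggedFile(object):
    """Wrap a real file object and copy every write to a log file."""
    def __init__(self, real, log):
        self.real = real
        self.log = log
    def write(self, txt):
        self.real.write(txt)
        self.log.write(txt)
    def flush(self):
        self.real.flush()
        self.log.flush()

log = open('log.txt', 'w')
sys.stdout = LoggedFile(sys.stdout, log)
print('hello')                 # reaches both the console and log.txt
sys.stdout = sys.stdout.real   # restore the original stream
log.close()
```

As the follow-ups in this thread point out, this only intercepts writes made inside the same Python process; it does not capture output written directly by a subprocess.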







Re: output to console and to multiple files

2007-02-16 Thread Gabriel Genellina
On Fri, 16 Feb 2007 14:04:33 -0300, Bart Ogryczak [EMAIL PROTECTED]
wrote:

 On Feb 14, 11:28 pm, [EMAIL PROTECTED] [EMAIL PROTECTED]
 wrote:

 I'm looking for a way to output stdout/stderr (from a subprocess or
 spawn) to screen and to at least two different files.

 I'd derive a class from file, override its write() method to send a
 copy to the log, and then assign sys.stdout = newFile(sys.stdout).
 Same for stderr.

That's ok inside the same process, but the OP needs to use it from a  
subprocess or spawn.
You have to use something like tee, working with real file handles.

-- 
Gabriel Genellina



Re: output to console and to multiple files

2007-02-16 Thread garrickp
On Feb 16, 3:28 pm, Gabriel Genellina [EMAIL PROTECTED] wrote:


 That's ok inside the same process, but the OP needs to use it from a
 subprocess or spawn.
 You have to use something like tee, working with real file handles.


I'm not particularly familiar with this, but it seems to me that if
you're trying to catch stdout/stderr from a program you can call with
(say) popen2, you could just read from the returned stdout/stderr
pipe, and then write to a series of file handles (including
sys.stdout).

Or am I missing something? =)

~G
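A sketch of that approach with the subprocess module (the child command here is a stand-in for whatever program you want to run; note this collects everything first, then writes it out):

```python
import subprocess
import sys

# Stand-in child process; substitute the real command you want to run.
proc = subprocess.Popen(
    [sys.executable, '-c',
     "import sys; print('out line'); sys.stderr.write('err line\\n')"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE,
    universal_newlines=True)
out, err = proc.communicate()    # read everything the child wrote

# Fan each stream out to the console and the log files.
with open('log.out', 'w') as outf, \
     open('log.err', 'w') as errf, \
     open('log.txt', 'w') as allf:
    for fp in (sys.stdout, outf, allf):
        fp.write(out)
    for fp in (sys.stderr, errf, allf):
        fp.write(err)
```

Because communicate() waits for the child to finish, nothing appears on screen until the process exits, which is the limitation raised in the reply below.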



Re: output to console and to multiple files

2007-02-16 Thread [EMAIL PROTECTED]
On Feb 16, 4:07 pm, [EMAIL PROTECTED] wrote:
 On Feb 16, 3:28 pm, Gabriel Genellina [EMAIL PROTECTED] wrote:



  That's ok inside the same process, but the OP needs to use it from a
  subprocess or spawn.
  You have to use something like tee, working with real file handles.

 I'm not particularly familiar with this, but it seems to me that if
 you're trying to catch stdout/stderr from a program you can call with
 (say) popen2, you could just read from the returned stdout/stderr
 pipe, and then write to a series of file handles (including
 sys.stdout).

 Or am I missing something? =)

 ~G

That works, but it isn't live streaming of stdout/stderr. If you read
both pipes sequentially, the process can deadlock, or the stdout/stderr
output can end up printed in the wrong order.



Re: output to console and to multiple files

2007-02-16 Thread Fuzzyman
On Feb 16, 11:37 pm, [EMAIL PROTECTED] [EMAIL PROTECTED]
wrote:
 On Feb 16, 4:07 pm, [EMAIL PROTECTED] wrote:



  On Feb 16, 3:28 pm, Gabriel Genellina [EMAIL PROTECTED] wrote:

   That's ok inside the same process, but the OP needs to use it from a
   subprocess or spawn.
   You have to use something like tee, working with real file handles.

  I'm not particularly familiar with this, but it seems to me that if
  you're trying to catch stdout/stderr from a program you can call with
  (say) popen2, you could just read from the returned stdout/stderr
  pipe, and then write to a series of file handles (including
  sys.stdout).

  Or am I missing something? =)

  ~G

 That works, but it isn't live streaming of stdout/stderr. If you read
 both pipes sequentially, the process can deadlock, or the stdout/stderr
 output can end up printed in the wrong order.

Every time I've looked to do something like this (non-blocking read on
the stdout of a subprocess) I've always come back to the conclusion
that threads and queues are the only reasonable way (particularly on
Windows). There may be a better solution using select.

Fuzzyman
http://www.voidspace.org.uk/python/articles.shtml
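A sketch of the thread-per-pipe approach (the child command is a stand-in; each pipe gets its own reader thread so neither can block the other, and lines are teed as they arrive):

```python
import subprocess
import sys
import threading

def tee_pipe(pipe, *files):
    """Copy each line from the pipe to every file as it arrives."""
    for line in iter(pipe.readline, ''):
        for fp in files:
            fp.write(line)
    pipe.close()

# Stand-in child process; substitute the real command.
proc = subprocess.Popen(
    [sys.executable, '-c',
     "import sys; print('to stdout'); sys.stderr.write('to stderr\\n')"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE,
    universal_newlines=True)

with open('log.out', 'w') as outf, \
     open('log.err', 'w') as errf, \
     open('log.txt', 'w') as allf:
    threads = [
        threading.Thread(target=tee_pipe,
                         args=(proc.stdout, sys.stdout, outf, allf)),
        threading.Thread(target=tee_pipe,
                         args=(proc.stderr, sys.stderr, errf, allf)),
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
proc.wait()
```

Both threads write to the combined log, so interleaving there is per-line rather than strictly ordered, but neither pipe can stall the other.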



Re: output to console and to multiple files

2007-02-15 Thread [EMAIL PROTECTED]
On Feb 14, 5:10 pm, goodwolf [EMAIL PROTECTED] wrote:
 like this?

 class Writers(object):

     def __init__(self, *writers):
         self.writers = writers

     def write(self, string):
         for w in self.writers:
             w.write(string)

     def flush(self):
         for w in self.writers:
             w.flush()

 import sys

 logfile = open('log.txt', 'w')
 sys.stdout = Writers(sys.__stdout__, file('log.out', 'w'), logfile)
 sys.stderr = Writers(sys.__stderr__, file('log.err', 'w'), logfile)


I've tried similar methods to this and to what Matimus wrote. I know
it works great when using print statements.
However, I'm looking for something that will work with the output
from a subprocess, such as from spawn, os.system, os.popen, etc.



Re: output to console and to multiple files

2007-02-15 Thread [EMAIL PROTECTED]
On Feb 14, 6:52 pm, Gabriel Genellina [EMAIL PROTECTED] wrote:
 On Wed, 14 Feb 2007 19:28:34 -0300, [EMAIL PROTECTED]
 [EMAIL PROTECTED] wrote:

  I'm looking for a way to output stdout/stderr (from a subprocess or
  spawn) to screen and to at least two different files.

 Look at the tee command. If you control the subprocess, and it's written
 in Python, using the Python recipes would be easier and perhaps you have
 more control.
 But if you can't modify the subprocess, you'll have to use tee.

 --
 Gabriel Genellina

Tee, the Unix command? Or is there a tee that is Python?



Re: output to console and to multiple files

2007-02-15 Thread Matimus
On Feb 15, 7:53 am, [EMAIL PROTECTED] [EMAIL PROTECTED]
wrote:
 On Feb 14, 5:10 pm, goodwolf [EMAIL PROTECTED] wrote:



  like this?

  class Writers(object):

      def __init__(self, *writers):
          self.writers = writers

      def write(self, string):
          for w in self.writers:
              w.write(string)

      def flush(self):
          for w in self.writers:
              w.flush()

  import sys

  logfile = open('log.txt', 'w')
  sys.stdout = Writers(sys.__stdout__, file('log.out', 'w'), logfile)
  sys.stderr = Writers(sys.__stderr__, file('log.err', 'w'), logfile)

 I've tried similar methods to this and to what Matimus wrote. I know
 it works great when using print statements.
 However, I'm looking for something that will work with the output
 from a subprocess, such as from spawn, os.system, os.popen, etc.

I think you should be able to use my or goodwolf's solution with the
subprocess module. Something like this (untested):

[code]
class TeeFile(object):
    def __init__(self, *files):
        self.files = files
    def write(self, txt):
        for fp in self.files:
            fp.write(txt)

if __name__ == "__main__":
    import sys
    from subprocess import Popen

    command = "whatever you want to run"
    outf = file("log.out", "w")
    errf = file("log.err", "w")
    allf = file("log.txt", "w")
    Popen(
        command,
        stdout=TeeFile(sys.__stdout__, outf, allf),
        stderr=TeeFile(sys.__stderr__, errf, allf)
    )
[/code]



Re: output to console and to multiple files

2007-02-15 Thread Matimus
On Feb 15, 8:51 am, Matimus [EMAIL PROTECTED] wrote:
 On Feb 15, 7:53 am, [EMAIL PROTECTED] [EMAIL PROTECTED]
 wrote:



  On Feb 14, 5:10 pm, goodwolf [EMAIL PROTECTED] wrote:

   like this?

   class Writers(object):

       def __init__(self, *writers):
           self.writers = writers

       def write(self, string):
           for w in self.writers:
               w.write(string)

       def flush(self):
           for w in self.writers:
               w.flush()

   import sys

   logfile = open('log.txt', 'w')
   sys.stdout = Writers(sys.__stdout__, file('log.out', 'w'), logfile)
   sys.stderr = Writers(sys.__stderr__, file('log.err', 'w'), logfile)

  I've tried similar methods to this and to what Matimus wrote. I know
  it works great when using print statements.
  However, I'm looking for something that will work with the output
  from a subprocess, such as from spawn, os.system, os.popen, etc.

 I think you should be able to use my or goodwolf's solution with the
 subprocess module. Something like this (untested):

 [code]
 class TeeFile(object):
     def __init__(self, *files):
         self.files = files
     def write(self, txt):
         for fp in self.files:
             fp.write(txt)

 if __name__ == "__main__":
     import sys
     from subprocess import Popen

     command = "whatever you want to run"
     outf = file("log.out", "w")
     errf = file("log.err", "w")
     allf = file("log.txt", "w")
     Popen(
         command,
         stdout=TeeFile(sys.__stdout__, outf, allf),
         stderr=TeeFile(sys.__stderr__, errf, allf)
     )
 [/code]

I tried this at lunch and it doesn't work. Some version of this method
may work, but Popen tries to call the 'fileno' method of the TeeFile
object (at least it did on my setup) and it isn't there. This is just
a preemptive warning before someone comes back to let me know my code
doesn't work.



Re: output to console and to multiple files

2007-02-15 Thread Gabriel Genellina
On Thu, 15 Feb 2007 19:35:10 -0300, Matimus [EMAIL PROTECTED] wrote:

 I think you should be able to use my or goodwolf's solution with the
 subprocess module. Something like this (untested):

 [code]
 class TeeFile(object):
     def __init__(self, *files):
         self.files = files
     def write(self, txt):
         for fp in self.files:
             fp.write(txt)


 I tried this at lunch and it doesn't work. Some version of this method
 may work, but Popen tries to call the 'fileno' method of the TeeFile
 object (at least it did on my setup) and it isn't there. This is just
 a preemptive warning before someone comes back to let me know my code
 doesn't work.

I don't think any Python-only solution could work. The pipe options
available for subprocess are those of the underlying OS, and the OS knows
nothing about Python file objects.
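One way around that constraint is to hand Popen a real OS-level pipe (which does have a file descriptor) and do the fan-out on the Python side; a sketch, with a stand-in child command:

```python
import os
import subprocess
import sys

# os.pipe() yields real file descriptors, which is what Popen needs.
read_fd, write_fd = os.pipe()

# Stand-in child process; substitute the real command.
proc = subprocess.Popen(
    [sys.executable, '-c', "print('via os.pipe')"],
    stdout=write_fd)
os.close(write_fd)   # close our copy so the reader sees EOF when the child exits

with os.fdopen(read_fd) as reader, open('log.txt', 'w') as logf:
    for line in reader:          # fan each line out to Python-level files
        sys.stdout.write(line)
        logf.write(line)
proc.wait()
```

The child writes to an ordinary pipe the OS understands; only the reading side needs to know about Python file objects.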

-- 
Gabriel Genellina



output to console and to multiple files

2007-02-14 Thread [EMAIL PROTECTED]
Hello,

I searched on Google and in this Google Group, but did not find any
solution to my problem.

I'm looking for a way to output stdout/stderr (from a subprocess or
spawn) to screen and to at least two different files.

e.g.
stdout/stderr -> screen
stdout -> log.out
stderr -> log.err

and if possible
stdout/stderr -> screen and log.txt

3 files from stdout/stderr



Re: output to console and to multiple files

2007-02-14 Thread Matimus
I took a look around and I couldn't find anything either. I will be
keeping an eye on this thread to see if someone posts a more standard
solution. In the meantime, though, I will offer up a potential
solution. Duck typing is your friend. If you are only using the write
method of your files, it can be pretty simple to implement a fake file
object to do what you want.

[code]
import sys

class TeeFile(object):
    def __init__(self, *files):
        self.files = files
    def write(self, txt):
        for fp in self.files:
            fp.write(txt)

if __name__ == "__main__":
    outf = file("log.out", "w")
    errf = file("log.err", "w")
    allf = file("log.txt", "w")
    sys.stdout = TeeFile(sys.__stdout__, outf, allf)
    sys.stderr = TeeFile(sys.__stderr__, errf, allf)

    print "hello world this is stdout"
    print >> sys.stderr, "hello world this is stderr"
[/code]



Re: output to console and to multiple files

2007-02-14 Thread goodwolf
like this?

class Writers(object):

    def __init__(self, *writers):
        self.writers = writers

    def write(self, string):
        for w in self.writers:
            w.write(string)

    def flush(self):
        for w in self.writers:
            w.flush()

import sys

logfile = open('log.txt', 'w')
sys.stdout = Writers(sys.__stdout__, file('log.out', 'w'), logfile)
sys.stderr = Writers(sys.__stderr__, file('log.err', 'w'), logfile)



Re: output to console and to multiple files

2007-02-14 Thread Gabriel Genellina
On Wed, 14 Feb 2007 19:28:34 -0300, [EMAIL PROTECTED]
[EMAIL PROTECTED] wrote:

 I'm looking for a way to output stdout/stderr (from a subprocess or
 spawn) to screen and to at least two different files.

Look at the tee command. If you control the subprocess, and it's written  
in Python, using the Python recipes would be easier and perhaps you have  
more control.
But if you can't modify the subprocess, you'll have to use tee.
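A sketch of wiring the Unix tee command in via subprocess (POSIX only; the child command is a stand-in, and this covers stdout only, though stderr can be chained through a second tee the same way):

```python
import subprocess
import sys

# Stand-in child whose stdout we want on screen and in a file.
child = subprocess.Popen(
    [sys.executable, '-c', "print('hello from child')"],
    stdout=subprocess.PIPE)
# tee copies its stdin to log.out and to its own stdout (the console).
tee = subprocess.Popen(['tee', 'log.out'], stdin=child.stdout)
child.stdout.close()   # drop our handle so tee sees EOF when the child exits
tee.wait()
child.wait()
```

Because tee works on real file descriptors, this streams live and needs no cooperation from the child process.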

-- 
Gabriel Genellina
