Re: How to schedule system calls with Python

2009-10-15 Thread Minesh Patel
 Again another great suggestion.  I was not aware of the
 multiprocessing module, and I'm not (yet) sure I understand why I
 should use it instead of multithreading as explained in a previous post.

http://docs.python.org/library/multiprocessing.html

First paragraph...
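The short version of that first paragraph: multiprocessing offers an API much like threading, but uses processes instead of threads, so CPU-bound work is not serialized by the GIL. A minimal sketch in modern Python (the worker function here is purely illustrative):

```python
from multiprocessing import Pool

def square(x):
    """CPU-bound work that threads could not run in parallel under the GIL."""
    return x * x

if __name__ == "__main__":
    # Each worker is a separate process, so the GIL is no bottleneck.
    with Pool(processes=4) as pool:
        print(pool.map(square, range(5)))  # -> [0, 1, 4, 9, 16]
```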

-- 
Thanks,
--Minesh
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: subprocess hangs on reading stdout

2009-10-14 Thread Minesh Patel

 Any ideas? comments on code welcome also.

Here's something that I would probably do; there may be better ways.
Note that the terminate() method requires Python 2.6 or later.


import signal
import subprocess

def timeout_handler(signum, frame):
    print "About to kill process"
    p.terminate()

signal.signal(signal.SIGALRM, timeout_handler)
for machine_name in self.alive:
    cmd = ['/bin/remsh', machine_name, 'ps -flu %s' % uid]
    signal.alarm(1)
    # stderr must also be piped, or communicate() returns None for it
    p = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE)
    (stdout, stderr) = p.communicate()
    signal.alarm(0)  # cancel the alarm once the process has finished
    if stdout:
        print stdout
    elif stderr:
        print stderr
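For what it's worth, on Python 3.3 and later the alarm dance can be avoided entirely: communicate() accepts a timeout argument and raises subprocess.TimeoutExpired, after which you kill and reap the child yourself. A hedged Python 3 sketch (the commands are illustrative stand-ins for remsh):

```python
import subprocess

def run_with_timeout(cmd, timeout=1):
    """Run cmd, killing it if it exceeds `timeout` seconds (Python 3.3+)."""
    p = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE)
    try:
        stdout, stderr = p.communicate(timeout=timeout)
    except subprocess.TimeoutExpired:
        p.kill()                          # terminate() would also work here
        stdout, stderr = p.communicate()  # reap the dead process
    return p.returncode, stdout, stderr

rc, out, err = run_with_timeout(["echo", "hello"])
print(out)  # -> b'hello\n'
```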



-- 
Thanks,
--Minesh
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: how to iterate over several lists?

2009-06-05 Thread Minesh Patel
On Fri, Jun 5, 2009 at 10:20 AM, Tom sevenb...@gmail.com wrote:
 On Fri, 5 Jun 2009 04:07:19 + (UTC), kj no.em...@please.post
 wrote:



Suppose I have two lists, list_a and list_b, and I want to iterate
over both as if they were a single list.  E.g. I could write:

for x in list_a:
    foo(x)
for x in list_b:
    foo(x)

But is there a less cumbersome way to achieve this?  I'm thinking
of something in the same vein as Perl's:

for $x in (@list_a, @list_b) {
  foo($x);
}

TIA!

kynn

 def chain(*args):
  return (item for seq in args for item in seq)

 for x in chain(list_a, list_b):
    foo(x)
 --
 http://mail.python.org/mailman/listinfo/python-list


If they are the same length, you can also look at the zip built-in
function, though it pairs elements up rather than yielding them one
after another.
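The chain() above is essentially itertools.chain from the standard library, and it behaves quite differently from zip, as a quick comparison shows:

```python
from itertools import chain

list_a = [1, 2, 3]
list_b = [4, 5]

# chain visits every element of each list in turn, like the Perl loop.
print(list(chain(list_a, list_b)))   # -> [1, 2, 3, 4, 5]

# zip, by contrast, pairs elements up and stops at the shorter input.
print(list(zip(list_a, list_b)))     # -> [(1, 4), (2, 5)]
```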

-- 
Thanks,
--Minesh
-- 
http://mail.python.org/mailman/listinfo/python-list


ConfigParser and newlines

2009-05-15 Thread Minesh Patel
I am using ConfigParser to parse a config file and I want to maintain
the newlines, how is it possible. Example given below. BTW, is there
an alternative to configParser, should I use 'exec' instead. Is there
any support for yaml built-in or possibly in the future?

test.cfg

[Foo_Section]

BODY = Line of text 1

   Continuing Line of text 1


Executing the code
===
Python 2.5.1 Stackless 3.1b3 060516 (release25-maint, Mar  6 2009, 14:12:34)
[GCC 4.3.0 20080428 (Red Hat 4.3.0-8)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> from ConfigParser import RawConfigParser
>>> config = RawConfigParser()
>>> config.read('test.cfg')
['test.cfg']
>>> config.get('Foo_Section', 'BODY')
'Line of text 1\nContinuing Line of text 1'

===

I was expecting 'Line of text 1\n\nContinuing Line of text 1'
with two newlines. How can I achieve that with ConfigParser?
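For what it's worth, the Python 3 configparser grew an empty_lines_in_values option (default True) that keeps blank lines inside multiline values, which gives exactly the expected result:

```python
from configparser import RawConfigParser

cfg_text = """\
[Foo_Section]
BODY = Line of text 1

   Continuing Line of text 1
"""

# empty_lines_in_values defaults to True in Python 3's configparser,
# so the blank line survives inside the multiline value.
config = RawConfigParser()
config.read_string(cfg_text)
print(repr(config.get('Foo_Section', 'BODY')))
# -> 'Line of text 1\n\nContinuing Line of text 1'
```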



-- 
Thanks,
./Minesh
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: ConfigParser and newlines

2009-05-15 Thread Minesh Patel
 Not as best I can tell.  From my /usr/lib/python2.5/ConfigParser.py file,
 around line 441:

  if line[0].isspace() and cursect is not None and optname:
      value = line.strip()
      if value:
          cursect[optname] = "%s\n%s" % (cursect[optname], value)

 That value = line.strip() is what's throwing away your extra newline.
 Then the "if value:" check refuses to add it back because the line was
 blank.  It looks like this behavior was intentional?


Is that a bug? What would be the use cases for stripping newlines
unnecessarily? It seems like ConfigParser is trying to be too clever.

-- 
Thanks,
./Minesh
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Is there is any way to send messages to chunk of emails ID's concurrently using smptlib

2009-05-05 Thread Minesh Patel
On Mon, May 4, 2009 at 2:01 PM, Piet van Oostrum p...@cs.uu.nl wrote:
 gganesh ganesh@gmail.com wrote:

 Hi friends,
 I suppose sendmail() can send mails one by one; how can I send mails
 concurrently?
 I would be very grateful if someone could point out a solution.


You can always use Twisted, which has an SMTP library that supports
concurrency. What follows is not a complete working example, and an
understanding of deferreds is required.

Ex:

from email.mime.text import MIMEText

from twisted.internet import reactor, defer
from twisted.mail import smtp
from twisted.python import log

def sendEmail(to, message, subject="Testing", _from='f...@bar.com'):
    """Used to send emails asynchronously to multiple recipients."""
    msg = MIMEText(message)
    msg['Subject'] = subject
    msg['From'] = _from
    msg['To'] = ", ".join(to)

    sending = smtp.sendmail('localhost', _from, to, msg.as_string())
    sending.addCallback(sendComplete, to).addErrback(sendFailed)

def sendFailed(error):
    print "Email failed: %r" % (error.getTraceback(),)

def sendComplete(result, recipients):
    numDelivered, addrInfo = result
    print addrInfo
    if numDelivered != len(recipients):
        log.msg("SmtpError: not all recipients received email %r" % addrInfo)

buf = 'TESTING'
sendEmail(to=['t...@spam.com'], message=buf)
# reactor.run() would be needed to actually drive the sends


-- 
Thanks,
./Minesh
--
http://mail.python.org/mailman/listinfo/python-list


Re: parallel/concurrent process in python

2009-03-10 Thread Minesh Patel
On Mon, Mar 9, 2009 at 2:47 PM,  ommer.sim...@gmail.com wrote:
 I'm trying to figure out parallel process python code. Something
 similar to fork funtion in C.

 For example, I using sniff tool in scapy to capture packets but i want
 this to run in the background:

 ---
 from scapy.all import *
 import subprocess
 import netsnmp

 pkts = sniff(iface="eth0", filter="UDP", count=100)  # This should run
                                                      # in the background

 print "Next Code."
 -


 "Next Code." should be printed out right away and should not have to
 wait for pkts = sniff(...) to finish.

 Any ideas?

Why not use os.fork()? It is the same as C's fork:

if os.fork():  # fork() returns the child's pid in the parent, 0 in the child
    # Do parent stuff
else:
    # Do child stuff

-- 
Thanks,
Minesh Patel
--
http://mail.python.org/mailman/listinfo/python-list


Re: parallel/concurrent process in python

2009-03-10 Thread Minesh Patel
 os.fork is not cross platform.  It is *nix only.  Subprocess runs on
 Windows also.  The OP never specified his platform.


Just out of curiosity, how is one able to replace an os.fork() call
with subprocess and have the child execute multiple statements? I
typically see subprocess used for spawning a shell command, piping,
etc...
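One common answer (not the only one) is that multiprocessing, rather than subprocess, is the cross-platform analogue here: the child runs an ordinary Python function, which can contain as many statements as you like, and results can come back over a queue instead of a pipe. A hedged sketch:

```python
from multiprocessing import Process, Queue

def child_work(q):
    # The child can execute any number of Python statements...
    total = sum(range(10))
    message = "child computed %d" % total
    # ...and report back through a queue instead of a pipe.
    q.put(message)

if __name__ == "__main__":
    q = Queue()
    p = Process(target=child_work, args=(q,))
    p.start()
    print(q.get())   # -> child computed 45
    p.join()
```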
-- 
Thanks,
Minesh Patel
--
http://mail.python.org/mailman/listinfo/python-list


Concurrent tasklets in Stackless Python

2009-03-09 Thread Minesh Patel
Is there a way for multiple tasklets to run in parallel? I have been
following the examples in
http://members.verizon.net/olsongt/stackless/why_stackless.html but it
seems that tasklets block for data or are scheduled and there is no
way to run them concurrently.

-- 
Thanks,
Minesh Patel
--
http://mail.python.org/mailman/listinfo/python-list


Re: Concurrent tasklets in Stackless Python

2009-03-09 Thread Minesh Patel
On Mon, Mar 9, 2009 at 3:17 PM, Chris Rebert c...@rebertia.com wrote:
 On Mon, Mar 9, 2009 at 3:05 PM, Minesh Patel min...@gmail.com wrote:
 Is there a way for multiple tasklets to run in parallel?

 Seems doubtful (though I'm not an expert).

 From the Wikipedia article: "Stackless microthreads are managed by the
 language interpreter itself, not the operating system kernel—context
 switching and task scheduling is done purely in the interpreter."

 This suggests that only one microthread is ever really being run at
 any given time. I would venture a guess that the lack of true
 parallel-ness is probably a design decision deeply ingrained into
 Stackless and is not changeable. You'd probably have to change to
 multiprocessing or threads.


Thanks Chris,
Can you suggest any Python libraries for true parallelism, or should I
just stick with Python threads or asyncore?
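For completeness: in later Pythons, concurrent.futures.ProcessPoolExecutor (3.2+) wraps process-based parallelism behind a simple API, so true multi-core execution needs very little code. A minimal sketch (the worker function is illustrative):

```python
from concurrent.futures import ProcessPoolExecutor

def cube(n):
    return n ** 3

if __name__ == "__main__":
    # Each submitted call may run on a different CPU core.
    with ProcessPoolExecutor(max_workers=2) as ex:
        results = list(ex.map(cube, [1, 2, 3, 4]))
    print(results)   # -> [1, 8, 27, 64]
```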

Thanks,
Minesh
--
http://mail.python.org/mailman/listinfo/python-list


Should I use stackless python or threads?

2009-03-06 Thread Minesh Patel
I am trying to figure out the best approach to solve this problem:

I want to poll various directories (this can run in the main thread).
Once I notice a file has been added to any directory, I grab a lock,
spawn a thread to go perform the necessary actions, and then release
the lock.

-- 
Thanks for the help,
Minesh Patel
--
http://mail.python.org/mailman/listinfo/python-list


Re: Should I use stackless python or threads?

2009-03-06 Thread Minesh Patel
On Fri, Mar 6, 2009 at 3:16 PM, Jean-Paul Calderone exar...@divmod.com wrote:
 On Fri, 6 Mar 2009 14:50:51 -0800, Minesh Patel min...@gmail.com wrote:

 I am trying to figure out the best approach to solve this problem:

 I want to poll various directories(can be run in the main thread).
 Once I notice a file has been added to any directory, I grab a lock,
 spawn a thread to go perform the necessary actions, and then release
 the lock.

 That's not a description of a problem.  That's a description of a potential
 solution.  What problem are you trying to solve?


I have a build system that is outputting various forms of
installations in a particular directory structure, e.g. /pxe-installs,
/iso-install, /dd-installs, etc...

I need to monitor each directory for the latest install, take it and
go perform some tests on a specific machine. I would like these
testing tasks to run concurrently for the obvious reasons.
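A hedged sketch of that design in modern Python (the directory names come from the post; run_tests is a hypothetical stand-in for the real per-install test run): poll each directory, remember what has been seen, and hand new files to a thread pool so the tests run concurrently:

```python
import os
import time
from concurrent.futures import ThreadPoolExecutor

WATCH_DIRS = ["/pxe-installs", "/iso-install", "/dd-installs"]

def run_tests(path):
    # Placeholder for running the install tests on a specific machine.
    print("testing %s" % path)

def poll_once(seen, executor, dirs=WATCH_DIRS):
    """One polling pass: submit any file not seen before."""
    for d in dirs:
        if not os.path.isdir(d):
            continue
        for name in os.listdir(d):
            path = os.path.join(d, name)
            if path not in seen:
                seen.add(path)
                executor.submit(run_tests, path)

if __name__ == "__main__":
    seen = set()
    with ThreadPoolExecutor(max_workers=4) as executor:
        for _ in range(3):          # a real monitor would loop forever
            poll_once(seen, executor)
            time.sleep(1)
```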

Thanks again for the help,
Minesh
--
http://mail.python.org/mailman/listinfo/python-list


Python patch module

2009-02-02 Thread Minesh Patel
Hi,
I was wondering if there is any patch management module for Python.
Basically I am looking to only apply a hunk from a patch if the file
exists.

-- 
Thanks,
Minesh
--
http://mail.python.org/mailman/listinfo/python-list