Re: reimport module every n seconds

2011-02-18 Thread Santiago Caracol
> > Don't do that.  ;-)  I suggest using exec instead.  However, I would be
> > surprised if import worked faster than, say, JSON (more precisely, I
> > doubt that it's enough faster to warrant this kludge).
>
> I'm with Aahz.  Don't do that.
>
> I don't know what you're doing, but I suspect an eve…
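
For reference, a minimal sketch of the exec suggestion quoted above, assuming the data live in a file data.py that assigns a top-level name `data` (file name, variable name, and the 10-second interval are taken from the original question; this is not code from the thread):

    import time

    def load_data(path="data.py"):
        namespace = {}
        with open(path) as f:
            exec(f.read(), namespace)   # run the file into a fresh dict
        return namespace["data"]        # pick up the top-level name it defines

    while True:
        data = load_data()
        # ... serve requests using the freshly loaded data ...
        time.sleep(10)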

reimport module every n seconds

2011-02-17 Thread Santiago Caracol
Hello, a server program of mine uses data which are compiled to a Python module for efficiency reasons. In some module of the server program I import the data:

    from data import data

As the data often changes, I would like to reimport it every n (e.g. 10) seconds. Unfortunately, it is rather dif…
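
A minimal sketch of the naive reload approach the post asks about (module and attribute names are taken from the post; whether this is advisable at all is what the reply above debates). Note that `from data import data` binds the name only once, so the name has to be rebound after each reload:

    import importlib   # Python 2 would use the builtin reload() instead
    import time

    import data

    while True:
        importlib.reload(data)   # re-executes data.py
        current = data.data      # rebind the name after each reload
        # ... use `current` here ...
        time.sleep(10)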

do something every n seconds

2010-11-25 Thread Santiago Caracol
Hello, how can I do something (e.g. check if new files are in the working directory) every n seconds in Python? Santiago
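
A minimal sketch of the straightforward polling answer (the new-file check and the interval are the examples from the question):

    import os
    import time

    n = 10
    seen = set(os.listdir("."))
    while True:
        time.sleep(n)
        current = set(os.listdir("."))
        for name in sorted(current - seen):
            print("new file: %s" % name)   # react to each new file here
        seen = current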

Re: sys.setrecursionlimit() and regular expressions

2010-09-30 Thread Santiago Caracol
> Why do you think so? The recursion limit has no effect on the speed of your
> script. It's just a number that the interpreter checks against.

Yes, sorry. I was just about to explain that. The 'of course' in my post was silly. In MY program, the recursion limit is relevant for performance, beca…

sys.setrecursionlimit() and regular expressions

2010-09-30 Thread Santiago Caracol
Hello, in my program I use recursive functions. A recursion limit of 10 would be by far sufficient. Yet, I also use some (not very complicated) regular expressions, which only compile if I set the recursion limit to 100, or so. This means, of course, an unnecessary loss of speed. Can I use one re…
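
A minimal sketch of one way to keep the two limits separate, raising the limit only while the patterns are compiled and restoring it afterwards (the pattern below is a hypothetical placeholder, not one from the thread; as the reply above notes, the limit itself does not cost speed, so this is only needed if a low limit is wanted for other reasons):

    import re
    import sys

    _old = sys.getrecursionlimit()
    sys.setrecursionlimit(max(_old, 100))        # enough for the regex compiler
    try:
        PATTERN = re.compile(r"(?:foo|bar)+")    # hypothetical placeholder pattern
    finally:
        sys.setrecursionlimit(_old)              # restore whatever was set before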

Re: freeze function calls

2010-08-11 Thread Santiago Caracol
Peter, thanks again for all this code. You helped me a lot.

> Didn't you say you weren't interested in the web specific aspects?

I thought that, although my problem had to do with client-server stuff, it wasn't really web-specific. But now I think that that was part of my problem. I failed to se…

Re: freeze function calls

2010-08-10 Thread Santiago Caracol
> Run the above with
>
> $ python wsgi_demo.py
> Serving on port 8000...

Thanks a lot for this code. The problem with it is that the whole application IS a generator function. That means that if I run the code at, say, foo.org, then any user that visits the site will augment the answer number of…
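
A minimal sketch of the usual fix for that sharing problem, keeping one generator per client instead of one for the whole application (the client-id key and the helper name are hypothetical, not from the thread's WSGI code):

    def calculate_answers():
        for i in range(100):
            yield i * i

    sessions = {}                      # hypothetical: client id -> generator

    def next_answer(client_id):
        if client_id not in sessions:  # first request from this client
            sessions[client_id] = calculate_answers()
        return next(sessions[client_id])

    print(next_answer("alice"))        # 0
    print(next_answer("bob"))          # 0 -- independent state per client
    print(next_answer("alice"))        # 1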

Re: freeze function calls

2010-08-10 Thread Santiago Caracol
> Python offers an elegant mechanism to calculate values on demand: the
> generator function:
>
> >>> def calculate_answers():
> ...     for i in range(100):
> ...             print "calculating answer #%d" % i
> ...             yield i * i
> ...

Thanks for pointing this out. I was aware of th…
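
For reference, a minimal sketch of how such a generator is consumed lazily, one value per call, using the calculate_answers definition quoted above:

    answers = calculate_answers()   # nothing is computed yet
    first = next(answers)           # prints "calculating answer #0", returns 0
    second = next(answers)          # prints "calculating answer #1", returns 1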

run subprocesses in parallel

2010-08-02 Thread Santiago Caracol
Hello, I want to run several subprocesses. Like so:

    p1 = Popen("mycmd1" + " myarg", shell=True)
    p2 = Popen("mycmd2" + " myarg", shell=True)
    ...
    pn = Popen("mycmdn" + " myarg", shell=True)

What would be the most elegant and secure way to run all n subprocesses in parallel? Santiago
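
A minimal sketch of the usual answer: start every process first and only then wait for them; passing argument lists instead of shell=True also avoids shell-quoting pitfalls (the command names are the placeholders from the post):

    from subprocess import Popen

    commands = [
        ["mycmd1", "myarg"],
        ["mycmd2", "myarg"],
        ["mycmdn", "myarg"],
    ]

    procs = [Popen(cmd) for cmd in commands]   # all start immediately
    for p in procs:
        p.wait()                               # block until each one finishes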