py.log using decorators for DRY

2005-10-29 Thread yoda
I'm using py.log for logging, and I find the following pattern emerging
within my code (influenced by
http://agiletesting.blogspot.com/2005/06/keyword-based-logging-with-py-library.html):


def foo(**kwargs):
    log.foo(kwargs)
    #body form

This led me to believe that I could simplify that pattern with the
following idiom:


def logit(fn):
    '''
    decorator to enable logging of all tagged methods
    '''
    def decorator(**kwargs):
        # call a method named fn.func_name on log with kwargs
        # should be something like: log.func_name(kwargs)
        pass

    return decorator


I can then add @logit to all my existing methods via a script
(there's a truckload of methods to tag):


@logit
def oldfoo(): pass


My question is about the body of the decorator.  How do I call that
method on the log object at runtime?

(ps. I hope my question is clear :)
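[Editor's note: a sketch of the runtime dispatch the question is after, using getattr() to look the method up by name. The FakeLog class is a stand-in for the real py.log producer, and fn.__name__ is the modern spelling of fn.func_name:]

```python
# FakeLog is a hypothetical stand-in for the py.log object: it records
# every (method name, kwargs) call so we can see the dispatch happen.
class FakeLog:
    def __init__(self):
        self.calls = []
    def __getattr__(self, name):
        def record(kwargs):
            self.calls.append((name, kwargs))
        return record

log = FakeLog()

def logit(fn):
    '''decorator to enable logging of all tagged methods'''
    def decorator(**kwargs):
        # fetch log.<function name> at runtime and call it with kwargs
        getattr(log, fn.__name__)(kwargs)
        return fn(**kwargs)
    return decorator

@logit
def oldfoo(**kwargs):
    return kwargs

oldfoo(a=1)   # logged as ("oldfoo", {"a": 1})
```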

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: py.log using decorators for DRY

2005-10-29 Thread yoda
I feel so stupid... lol... now why didn't I think of that?

Thanks Alex.



Re: setuptools, ez_setup over http proxy

2005-10-14 Thread yoda
Thanks guys,
Incidentally, I had already tried setting the env variable $http_proxy
but that didn't seem to work.

That being said, I'm moving this discussion to the distutils-SIG
mailing list while I carry out some tests.

Thanks again.



Re: setuptools, ez_setup over http proxy

2005-10-14 Thread yoda
It appears that it was my proxy config that was flaky.  setuptools
works like a charm. :$



setuptools, ez_setup over http proxy

2005-10-13 Thread yoda
I've recently configured my network such that I use squid as a http
proxy.  I'd now like to be able to use setuptools and ez_setup via this
proxy.  Is this possible? If so, how do I do it?

The most that the setuptools documentation says is
(http://peak.telecommunity.com/DevCenter/setuptools):

If you are behind an NTLM-based firewall that prevents Python
programs from accessing the net directly, you may wish to first install
and use the APS proxy server, which lets you get past such firewalls in
the same way that your web browser(s) do.

ps. I'm not sure that this is the right forum for this question.  If it
isn't feel free to point me to the right place.
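[Editor's note: for the record, the environment-variable route the replies suggested looks like this. The host and port are placeholders (3128 is Squid's default listen port), and the sanity check uses a modern Python's urllib:]

```shell
# Point the tools at the Squid proxy via the standard environment
# variable that urllib (which ez_setup/setuptools use) honours.
# localhost:3128 is a placeholder -- adjust to your squid.conf.
export http_proxy="http://localhost:3128"

# Sanity-check that Python actually picks it up:
python3 -c "import urllib.request; print(urllib.request.getproxies().get('http'))"
```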



Automating, Building, Testing and Deploying to Production Server

2005-10-02 Thread yoda
Hi Guys,
I've been used to deploying code to the production server by checking
out of subversion and manually sorting out any kinks. (yes, I know, it
sounds primitive)

I realize I'm losing so much time I could spend more productively. I'd
therefore like to know the different approaches you guys employ to
deploy builds from your staging servers (or laptops:) to the production
server in an automated repeatable safe manner.

How do you automate the process?
What tools do you use and how?
What documentation is available for the various tools?
What is the best, easiest, most automated method that provides robust
versioning and easy rollback?
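[Editor's note: one common tool-agnostic pattern for the last question is timestamped release directories plus a 'current' symlink, which makes rollback a one-liner. Everything below is illustrative: the paths are placeholders, and the mkdir stands in for an `svn export` of a tagged revision:]

```shell
APP=/tmp/demo-app            # placeholder application root
mkdir -p "$APP/releases"

# 1. Export the build into a fresh, timestamped release directory.
REL="$APP/releases/$(date +%Y%m%d%H%M%S)"
mkdir -p "$REL"              # real life: svn export "$REPO_URL/tags/v1.2" "$REL"

# 2. Atomically repoint the symlink the server actually runs from.
ln -sfn "$REL" "$APP/current"

# 3. Rollback is just repointing 'current' at the previous release dir.
readlink "$APP/current"
```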



Re: 1 Million users.. I can't Scale!!

2005-09-29 Thread yoda
1. How are you transmitting your SMSs?
Currently, a number of different gateways are being used: two provide a
SOAP web service interface, and one provides a REST-based web service.

A transaction using the SOAP web services takes 3-5 seconds to complete
(from the point of calling the method to receiving an error/success
confirmation).
The REST web service transaction takes 1 second or less to complete.

 2. If you disable the actual transmission, how many SMSs can your
application generate per second?
Currently, the content is generated and then a number of SMSs per user
are produced. I'll have to measure this more accurately, but a cursory
glance indicated that we're generating approximately 1,000 SMSs per
second. (I'm sure this can't be right; the parser/generator should be
faster than that :)

Additionally, I've just confirmed that the gateways we use can pump
out 20-100 SMSs per second. This is currently too slow, so we'll
probably get direct access to the mobile operator's SMSC, which
provides greater throughput.
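[Editor's note: taking the post's own figures, a back-of-envelope calculation shows why the gateway throughput, not the generator, is the bottleneck. The subscriber count and rates are from the thread; the arithmetic is the editor's:]

```python
# How long one message to every subscriber takes at the gateways'
# quoted 20-100 SMS/sec (figures from the post above).
users = 1_000_000

def hours_to_reach_all(rate_per_sec):
    return users / rate_per_sec / 3600.0

print(round(hours_to_reach_all(20), 1))    # slowest quoted rate
print(round(hours_to_reach_all(100), 1))   # fastest quoted rate
```

At 100 SMS/sec a single update still takes nearly three hours to fan out, far beyond the 5-minute delivery window described later in the thread.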



Re: 1 Million users.. I can't Scale!!

2005-09-29 Thread yoda
Thanks for the whitepapers and incredibly useful advice.  I'm beginning
to get a picture of what I should be thinking about and implementing to
achieve this kind of scalability.  Before I go down any particular
route here's a synopsis of the application.

1)User requests are received only during subscription.  We currently
don't have any problems with this because subscription rates increase
along a sigmoid curve.

2)Once a user subscribes we begin to send them content as it becomes
available.

3)The content is sports data. Content generation depends on the day:
on days when there's a lot of action, we can generate up to 20
separate items within a second, every 10 minutes.

4)The content is event driven e.g. a goal is scored. It is therefore
imperative that we send the content to the subscribers within a period
of 5 minutes or less.

There is a difference between one million users each who make one request once 
a month, and one million users who are each hammering the system with ten 
requests a second. Number of users on its own is a meaningless indicator of 
requirements.
Quite true, and this lack of clarity was a mistake on my part.  Requests
from users do not really become a significant part of this equation
because, as described above, once a user subscribes the onus is on us
to generate messages throughout a given period determined by the number
of updates the user has subscribed to receive.

5)Currently, hardware is a constraint (we're a startup and can't afford
high-end servers). I would prefer a solution that doesn't require any
changes to the hardware stack. For now, let's assume that hardware is
not part of the equation and every optimization has to be software-based
(except the beautiful network optimizations suggested).



1 Million users.. I can't Scale!!

2005-09-28 Thread yoda
Hi guys,
My situation is as follows:

1)I've developed a service that generates content for a mobile service.
2)The content is sent through an SMS gateway (currently we only send
text messages).
3)I've got a million users (and climbing).
4)The users need to get the data within a maximum of 5 seconds after
it's generated (not considering any bottlenecks external to my code).
5)Generating the content takes 1 second.

I'm considering moving to Stackless Python so that I can use
continuations to open a massive number of connections to the gateway
and pump the messages out to each user simultaneously (I'm thinking of
one connection per user).
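[Editor's note: as a point of comparison before reaching for Stackless, here is a minimal worker-pool sketch of the fan-out using plain threads and a queue. send_sms is a hypothetical stand-in that just records the call instead of talking to a real gateway:]

```python
# N worker threads share one queue of (user, message) jobs, so the
# number of gateway connections is bounded by the pool size rather
# than the subscriber count.
import queue
import threading

sent = []
sent_lock = threading.Lock()

def send_sms(user, message):
    # Stand-in for a real gateway call: just record what was "sent".
    with sent_lock:
        sent.append((user, message))

def worker(jobs):
    while True:
        job = jobs.get()
        if job is None:          # sentinel: shut this worker down
            break
        send_sms(*job)
        jobs.task_done()

jobs = queue.Queue()
threads = [threading.Thread(target=worker, args=(jobs,)) for _ in range(8)]
for t in threads:
    t.start()

for user in range(100):          # a toy batch of 100 subscribers
    jobs.put((user, "GOAL! 1-0"))

jobs.join()                      # wait until every job is processed
for _ in threads:
    jobs.put(None)               # one sentinel per worker
for t in threads:
    t.join()

print(len(sent))
```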

My questions therefore are:
1)Should I switch to Stackless Python, or should I carry out
experiments with multithreading the application?
2)What architectural suggestions can you give me?
3)Has anyone encountered such a situation before? How did you deal with
it?
4)Lastly, and probably most controversial: Is python the right language
for this? I really don't want to switch to Lisp, Icon or Erlang as yet.

I really need help because my application currently can't scale. Some
users end up getting their data 30 seconds after generation (best case)
and up to 5 minutes after content generation.  This is simply
unacceptable.  The subscribers deserve much better service if my
startup is to survive in the market.



Re: Chronological Processing of Files

2005-09-26 Thread yoda
I've tried using the path module and it works like a *charm*.. plus my
code is cleaner and clearer.. :)

The list comprehension using os.stat() works well, though I had to call
an additional reverse() on the resultant list to get it in
newest-first order.
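[Editor's note: for reference, the os.stat() version the thread describes might look like the sketch below: walk the tree, decorate each path with its modification time, sort, then reverse for newest-first. The function name is the editor's:]

```python
import os

def files_newest_first(root):
    # Decorate every file under root with its mtime, sort ascending,
    # then reverse so the most recently modified file comes first.
    decorated = [
        (os.stat(path).st_mtime, path)
        for dirpath, dirnames, filenames in os.walk(root)
        for path in (os.path.join(dirpath, f) for f in filenames)
    ]
    decorated.sort()
    decorated.reverse()          # newest (largest mtime) first
    return [path for mtime, path in decorated]
```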

So,  in conclusion, I'll use the path module.

Thanks again guys. You've been a great help.



Re: Chronological Processing of Files

2005-09-22 Thread yoda
Just to clarify:

Newest == modified last

The processing/sorting should apply to all the files found recursively
during the entire walk.

That being said, thanks for all the responses. I'll test the code
shortly and get back to everyone.

ps. This is why comp.lang.python is truly the greatest list ever.
(better than comp.lang.lisp?) Everyone is so helpful. Thanks again guys.



Chronological Processing of Files

2005-09-21 Thread yoda
This feels like a stupid question but I'll ask it anyway.

How can I process files chronologically (newest last) when using
os.walk()?



Re: Language Work Benches in Py

2005-09-06 Thread yoda
I realize that I forgot to post the sample code :). Below is my
implementation:

#DSL data
#123456789012345678901234567890123456789012345678901234567890

dsldata = (
    'SVCLFOWLER 10101MS0120050313.',
    'SVCLHOHPE  10201DX0320050315',
    'SVCLTWO   x10301MRP220050329..',
    'USGE10301TWO  x50214..7050329...')

#Class mappings
Mappings = {'svcl': {(4, 18): 'CustomerName',
                     (19, 23): 'CustomerID',
                     (24, 27): 'CallTypeCode',
                     (28, 35): 'DateOfCallString'},
            'usge': {(4, 8): 'CustomerID',
                     (9, 22): 'CustomerName',
                     (30, 30): 'Cycle',
                     (31, 36): 'ReadDate'}}


def generateClass(data):
    'generate the class and set its attributes from the record'

    className = data[:4].lower()      #1) first four chars name the record type
    mappingData = Mappings[className] #2) pick the field map for that type
    class Klass: pass                 #3) start from an empty class
    Klass.__name__ = className        #4) rename it after the record type
    #print Klass

    for key in mappingData.keys():    #5) slice each field out of the record
        fielddata = data[key[0]:key[1]]
        print 'actual data-', fielddata
        setattr(Klass, mappingData[key], fielddata) #6) attach as an attribute
    return Klass


def parseData():
    'parse the data and generate a list of objects'
    return [generateClass(item) for item in dsldata]


def printKlassData(Klass):
    print Klass
    for key in Klass.__dict__:
        print 'attr-%s value-%s' % (key, Klass.__dict__[key])


if __name__ == '__main__':
    for Klass in parseData():
        printKlassData(Klass)



Language Work Benches in Py

2005-09-03 Thread yoda
Hi,
I recently read Martin Fowler's article on language workbenches and
domain specific
languages(http://www.martinfowler.com/articles/languageWorkbench.html).
I then had the pleasure of reading Rainer Joswig's implementation of
the sample in Lisp (http://lispm.dyndns.org/news?ID=NEWS-2005-07-08-1).

The Lisp code was so sexy that I was inspired to write a sample in
Python. I'm relatively new to coding in Python, so I'd love any barbs,
comments, or criticisms about the code. You can find my article here:
(http://billionairebusinessman.blogspot.com/2005/09/drop-that-schema-and-put-your-hands-in.html).

I also made a screen cast of the same
(http://openenterpriseafrica.com/neo/blogs/010905/dsl-in-python.wmv.bz2).
Unfortunately, I had to make it on a Windows machine, so it's encoded
as WMV. (If anyone finds it useful and is inspired to encode it in a
more palatable format, e.g. MOV, I'd be honoured to create a torrent
and host it.)



Py: a very dangerous language

2005-08-01 Thread yoda
It was 2 a.m... I was writing my first enterprise-scale application in
Python... the logic just flowed from my mind onto the keyboard and was
congealed into the most beautiful, terse lines of code I had ever
seen...

It was 3 a.m... I knew I had to sleep... work was the next day, or
rather, in a few hours... but Python somehow brought out all the logic.
All the verbosity of my thought was purified into clean, beautiful
logic...

The Python was wringing the cruft out of my thought and letting me
generate wonderfully clean code... I have to sleep... this is the last
line of code I'm writing...

It was 4 a.m... just one more def... then I'll sleep...

It was 5 a.m... just one more class... I'll sleep now... I've got to go
to work in a few hours...

It was 6 a.m... just one more lambda... I'll really sleep
now... seriously... I've got to go to work in a few hours...

Python is a very dangerous language... It is addictive... Once you
start coding, you simply can't stop... No language has ever made
(allowed?) me think so clearly before... This is madness... I hardly
ever sleep... I simply can't stop coding when I use Python...

This is scary... maybe I should switch back to Java: a language so
unwieldy that I'm driven away from the keyboard in disgusted
frustration...

I need to sleep... but Python won't let me... Python is a dangerous
language...



Standard Threads vs Weightless Threads

2005-08-01 Thread yoda
Recently I read Charming Python: Implementing Weightless Threads
(http://www-128.ibm.com/developerworks/linux/library/l-pythrd.html) by
David Mertz.

I'm not an authority on threading architectures so I'd like to ask the
following:

1)What is the difference (in terms of performance, scalability,[insert
relevant metric here]) between microthreads and system threads?

2)If microthreads really are superior, then why aren't they the
standard Python implementation (or at least within the standard
library)? (My assumption is that they are neither the standard
implementation nor part of the standard library.)

ps. I fear that my questions and assumptions about threading may be
rather naive. If they are, it's because I haven't yet done any
significant research.
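[Editor's note: a toy version of the article's idea, with the caveat that this is a sketch, not Stackless. Each "weightless thread" is a generator, and a scheduler round-robins them, so a context switch is just a yield rather than an OS thread switch:]

```python
from collections import deque

def scheduler(tasks):
    # Round-robin a set of generator "microthreads" until all finish.
    ready = deque(tasks)
    while ready:
        task = ready.popleft()
        try:
            next(task)           # run the task up to its next yield
        except StopIteration:
            continue             # task finished: drop it
        ready.append(task)       # otherwise requeue it at the back

trace = []

def counter(name, n):
    for i in range(n):
        trace.append((name, i))
        yield                    # cooperative "context switch"

scheduler([counter("a", 2), counter("b", 2)])
print(trace)                     # steps from "a" and "b" interleave
```

Because a yield is just a function-frame suspension, tens of thousands of these tasks cost far less memory and switching overhead than the same number of system threads, which is the article's central claim.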



Re: Listing Processes Running on Remote Machines

2005-08-01 Thread yoda
Thanks. I'll definitely look into it. I actually got distracted while
investigating Pyro (http://pyro.sourceforge.net/) to see if I could
achieve the same results... It seems like a lighter-weight, more
accessible solution than STAF (at least for my immediate needs)...

I'll update you with my progress...



Listing Processes Running on Remote Machines

2005-07-21 Thread yoda
Hello Hackers,
I'm developing a large-scale distributed service, and part of the
requirement is that I be able to monitor clients in a very granular
way.

To this end, I'd like to know if there is any way to get a list of all
the processes running on a remote client/machine.  I need to be able to
do this on demand (i.e. on user demand).

Please note that the clients run heterogeneous operating systems,
mainly Linux and Windows 2000/XP.
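[Editor's note: there is no portable stdlib call for this. For the Linux clients a local listing can simply read /proc, and a remote version would expose that function over something like the Pyro or XML-RPC mentioned in the reply; the Windows clients would need a different backend (e.g. WMI). The sketch below is therefore Linux-only and guarded:]

```python
import os

def list_pids():
    '''Return the numeric PIDs of all processes visible in /proc (Linux).'''
    if not os.path.isdir("/proc"):
        raise OSError("no /proc here -- this sketch is Linux-only")
    # Every directory in /proc whose name is all digits is a process.
    return sorted(int(name) for name in os.listdir("/proc") if name.isdigit())

if os.path.isdir("/proc"):
    print(len(list_pids()))      # number of processes on this machine
```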
