[Tutor] ftp transfer - queue & retry pattern
Hello, I want to write a module that transfers a given file to a given destination with a given protocol (FTP, SFTP, SCP); to start out, just FTP would be sufficient. The destination machines are sometimes unavailable, or the transfer sometimes fails due to connection problems. I have several questions (sorry):

1) Python ftplib, or curl via commands.getoutput(...)? What are the benefits / drawbacks? (Portability is not an issue for me, and I like curl a lot.)

2) Is there a simple way to create a subprocess(?) that I can dispatch my transfer jobs to from my main script? The goal is that my main script can proceed without being hung by the transfer process.

3) That subprocess would need a queue that collects all transfer jobs. Can you give me some hints on how that could be done in Python?

4) The subprocess(?) would try to transfer all jobs one after another; if one transfer fails, retry after a certain time period (say 2, 5, 10, 20, 40 min) and stop after maxnum retries. How could I code this nicely in Python without resorting to if/then/else etc.?

Or is there a module that does 1)-4) out there somewhere already?

Thanks for any insight you might have, -frank

___ Tutor maillist - Tutor@python.org http://mail.python.org/mailman/listinfo/tutor
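For questions 2)-4), the standard library already has the pieces: a `queue.Queue` for the job list (spelled `Queue` in Python 2) and a `threading.Thread` worker that pulls jobs off it, so the main script never blocks on a transfer. A minimal sketch follows -- the retry logic just loops over the delay schedule instead of nesting if/else, and `do_transfer` is a placeholder for whatever ftplib or curl call you end up writing:

```python
import queue
import threading
import time

# The retry schedule from the post: 2, 5, 10, 20, 40 minutes (in seconds).
RETRY_DELAYS = (2 * 60, 5 * 60, 10 * 60, 20 * 60, 40 * 60)

def transfer_worker(jobs, do_transfer, delays=RETRY_DELAYS):
    """Pull jobs off the queue and run do_transfer(job) on each one,
    waiting each delay in `delays` before the next retry; give up on a
    job once the schedule is exhausted."""
    while True:
        job = jobs.get()
        if job is None:              # sentinel: shut the worker down
            jobs.task_done()
            return
        for delay in (0,) + tuple(delays):
            time.sleep(delay)        # 0 s before the very first attempt
            try:
                do_transfer(job)
                break                # success -> move on to the next job
            except OSError:
                continue             # connection trouble -> retry later
        jobs.task_done()

def start_worker(do_transfer):
    """Spawn the background worker and hand back the queue to feed it."""
    jobs = queue.Queue()
    threading.Thread(target=transfer_worker,
                     args=(jobs, do_transfer),
                     daemon=True).start()
    return jobs
```

Usage from the main script is then just `jobs = start_worker(my_ftp_put)` followed by `jobs.put((filename, destination))` whenever a transfer is needed; `start_worker` and `my_ftp_put` are names made up for this sketch, not a standard API.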
[Tutor] python watchfolder as daemon
Hello all, I found this gem of a Python recipe that allows me to monitor a hierarchy of folders on my filesystem for additions / changes / removals of files: http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/215418

I would like to monitor a folder hierarchy using this script, but now I wonder what would be the best way to have this watchfolder script running at all times in the background.

nohup python watchfolder.py myfolder &

Would that be suitable? Or is there a way to launch a process like that from Python that detaches itself from the shell? What would be the best way? Any insight or recommendation is very much appreciated.

Thanks, -frank
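`nohup ... &` does work for this; the self-contained alternative is to have the script detach itself with the classic Unix double fork before entering its watch loop. A sketch of that recipe (Unix only, and only one of several options -- a process supervisor or the `python-daemon` package are worth a look for anything long-lived):

```python
import os
import sys

def daemonize():
    """Detach the current process from the shell via the classic
    double fork: the original process and the intermediate session
    leader both exit, and only the fully detached grandchild returns."""
    if os.fork() > 0:
        os._exit(0)          # original process exits; shell gets its prompt back
    os.setsid()              # become session leader, drop the controlling tty
    if os.fork() > 0:
        os._exit(0)          # session leader exits; grandchild can never reacquire a tty
    os.chdir("/")            # don't keep any directory busy
    # point the standard streams at /dev/null so nothing writes to the old tty
    with open(os.devnull, "rb") as dev_in, open(os.devnull, "ab") as dev_out:
        os.dup2(dev_in.fileno(), 0)
        os.dup2(dev_out.fileno(), 1)
        os.dup2(dev_out.fileno(), 2)
```

The watchfolder script would call `daemonize()` first and then run its normal monitoring loop; everything after the call executes in the detached process.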
[Tutor] handling of tabular data
Hello, I often find myself writing Python programs to compute averages, min, max, top 10 etc. of columns in a table of data. In these programs, I always capture each row of the table in a tuple; the table is then represented by a list of tuples. Computing averages, min, max and other meta-information is then done with for loops or some list comprehension.

Now I wonder whether I shouldn't be using classes instead of tuples to capture the table rows. With the little I know about classes, I assume that I would then have a list of class instances as the representation of my tabular data. But given such a list of class instances, I would still need for loops to get to e.g. the minimal value of a certain attribute across all instances in that list, or? And what would be the benefit of using classes then?

What is the best practice? Can anyone shed some light on this?

Thanks, -frank
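To illustrate the trade-off: whether the rows are tuples or class instances, the aggregation still iterates (or lets a builtin iterate for you), but named attributes plus the `key=` argument to `min`/`max`/`sorted` remove the magic-index bookkeeping of `row[1]`. `collections.namedtuple` gives you that with almost no class-writing; the column names below are made up for the sketch:

```python
from collections import namedtuple

# Hypothetical row layout -- substitute your own column names.
Row = namedtuple("Row", ["name", "size", "age"])

table = [
    Row("a.log", 120, 3),
    Row("b.log", 45, 9),
    Row("c.log", 300, 1),
]

# The loop is still there, hidden inside the builtins, but the code now
# reads in terms of the data instead of tuple positions:
smallest = min(table, key=lambda row: row.size)
average_age = sum(row.age for row in table) / len(table)
top2_by_size = sorted(table, key=lambda row: row.size, reverse=True)[:2]
```

A full class only starts to pay off beyond this when rows need behavior of their own (validation, derived values, formatting); for pure column math, namedtuples keep the list-of-rows model you already have.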
[Tutor] checking substrings in strings
Hello, I would like to check if a certain word exists in a given string. Since I learned to love lists, I used

if myword in mystring:

...didn't work. I have now resorted to

if mystring.find(myword) > 1:

Is this really the canonical way to check if myword exists in mystring? It feels somewhat "unpythonic", that's why I'm asking.

Thanks for any insight you might have, -frank
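For what it's worth, `myword in mystring` *is* the canonical substring test in Python 2.3 and later; before 2.3, `in` on strings only accepted a single character, which may be why the first attempt failed. Note also that `str.find` returns -1 when the word is absent and 0 for a match at the very start, so `find(...) > 1` silently misses matches at index 0 or 1 -- the correct comparison is `!= -1`. A quick demonstration:

```python
text = "the quick brown fox"

# The `in` operator tests substrings directly (Python 2.3+):
assert "quick" in text
assert "lazy" not in text

# str.find returns the match index, or -1 when absent, so the test
# `find(...) > 1` would wrongly reject a match at index 0 or 1:
assert text.find("the") == 0      # present, but 0 > 1 is False
assert text.find("lazy") == -1    # absent
```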