Re: Obtaining user information
On 10/12/11 02:44:48, Tim Chase wrote: Currently I can get the currently-logged-in-userid via getpass.getuser() which would yield something like "tchase". Is there a cross-platform way to get the full username (such as from the GECOS field of /etc/passwd or via something like NetUserGetInfo on Win32) so I'd get "Tim Chase" instead? How about: pwd.getpwuid(os.getuid()).pw_gecos This will give you the GECOS field of /etc/passwd. I'd assume it contains "Tim Chase" for your account. Hope this helps, -- HansM -- http://mail.python.org/mailman/listinfo/python-list
Re: Obtaining user information
On 09Dec2011 19:44, Tim Chase wrote: | Currently I can get the currently-logged-in-userid via | getpass.getuser() which would yield something like "tchase". _If_ you're on a terminal. _And_ that's exactly what you want. Personally I need the name for geteuid() or getuid() more often. | Is there a cross-platform way to get the full username (such as from | the GECOS field of /etc/passwd or via something like NetUserGetInfo | on Win32) so I'd get "Tim Chase" instead? Hmm. Doesn't windows have a posix layer? pwd.getpwuid(os.getuid())[4].split(',')[0] is the best I've got. And it probably doesn't work in Windows :-( -- Cameron Simpson DoD#743 http://www.cskk.ezoshosting.com/cs/ There's not a woman in his book, the plot hinges on unkindness to animals, and the black characters mostly drown by chapter 29. - P J O'Rourke parodying a PC review of Moby Dick -- http://mail.python.org/mailman/listinfo/python-list
Re: How to move scrollbar by code?
On Sat, Dec 10, 2011 at 4:05 PM, Muddy Coder wrote: > I am trying to make a listbox that will contain a looong data list, > sorted, so I will be able to pre-select a data line by coding. I have > done it. Which GUI toolkit are you using? What you want is not the Python language docs, but the docs for that toolkit (GTK, Qt, Tk, etc). ChrisA -- http://mail.python.org/mailman/listinfo/python-list
How to move scrollbar by code?
Hi Folks, I am trying to make a listbox that will contain a looong data list, sorted, so I will be able to pre-select a data line by coding. I have done it. Say my listbox contains 1000 data lines, and my program has figured out that data line 321 is needed, so it just puts the cursor on data line 321. However, my scrollbar is still sitting at the top, so I can only view data lines 0 to 30 or 40 and can't see the desired data line 321. I still need to manually pull the scrollbar down to display data line 321. What I want to do is to grab the adjacent data lines, say from line 300 to 340, and display these lines, with my cursor sitting in the middle. I consulted the Python Docs, but did not find such details. Can somebody give an idea? Thanks! Cosmo -- http://mail.python.org/mailman/listinfo/python-list
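Assuming the widget is a Tkinter Listbox (other toolkits have equivalents), see(index) is the simplest fix: it scrolls just enough to make the line visible. To center the line instead, yview_moveto(fraction) takes a 0.0-1.0 position. A sketch of the fraction arithmetic, with the Tk calls (which need a display) shown as comments:

```python
def center_fraction(index, total, visible=30):
    """Fraction to pass to Listbox.yview_moveto() so that line `index`
    sits roughly in the middle of a window showing `visible` lines."""
    if total <= visible:
        return 0.0
    top = max(0, min(index - visible // 2, total - visible))
    return top / float(total)

# With a Tkinter Listbox the calls would look like (not run here):
#   listbox.selection_set(321)                        # highlight line 321
#   listbox.see(321)                                  # simplest: make it visible
#   listbox.yview_moveto(center_fraction(321, 1000))  # or center it
```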
Re: Multiprocessing bug, is information ever omitted from a traceback?
On 12/9/2011 6:14 PM, John Ladasky wrote: http://groups.google.com/group/comp.lang.python/browse_frm/thread/751b7050c756c995# I'm programming in Python 2.6 on Ubuntu Linux 10.10, if it matters. It might, as many bugs have been fixed since. Can you try the same code with the most recent 2.x release, 2.7.2? Do you have working and non-working code that you can publicly release? Can you reduce the size and dependencies so the examples are closer to 'small' than 'large'? And in any case, self-contained? In my first response, I said you might have found a bug. A bogus exception message qualifies. But to do much, we need minimal good/bad examples that run or not on a current release (2.7.2 or 3.2.2). -- Terry Jan Reedy -- http://mail.python.org/mailman/listinfo/python-list
Re: Buffering of sys.stdout and sys.stderr in python3 (and documentation)
On 12/9/2011 2:32 PM, Geoff Bache wrote: Hi all, Short version: I'm a bit confused in general as to the changes between python2 and python3 regarding how standard output and standard error do buffering. A few things seem to have changed and I've failed to find any documentation of how and why. Also, the meaning of "python -u" seems to have changed and the docs don't seem to reflect the new behaviour (and I can't find any docs about the change either)... Long version: From rude experiment it seems that: 1) In Python 2.x, standard error was always unbuffered while standard output was buffered by default. In python3, both are buffered. In both cases, "buffered" means line-buffered when writing to the console and not line-buffered when redirected to files. 2) In Python 2.x, the "-u" flag meant everything was totally unbuffered. In Python 3.x, it means that both stdout and stderr are line-buffered also when redirected to files. Are either of these changes documented anywhere? (1) seems important : it can lead to not seeing exception printouts, if stderr is redirected to a file and the program is subsequently terminated with SIGTERM. I just wasted quite a bit of time due to this situation... This is what the Python 3 docs have to say about the -u flag: "Force the binary layer of the stdin, stdout and stderr streams (which is available as their buffer attribute) to be unbuffered. The text I/O layer will still be line-buffered." The "still" seems misleading to me, as it is only relevant if writing to the console. It would be useful to contrast the behaviour with and without "-u" when writing to files I would say. The difference from 2.x should be in What's New in 3.0, except that the new i/o module is in 2.6, so it was not exactly new. You might be able to find more in http://python.org/dev/peps/pep-3116/ You *should* be able to find sufficient info in the 3.x docs. 
If, after you get other responses (or not), you think the docs need upgrading, open an issue on the tracker at bugs.python.org with suggestions as specific as possible, including changed or new lines of text based on your experience and experiments. -- Terry Jan Reedy -- http://mail.python.org/mailman/listinfo/python-list
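Whatever the interpreter's default policy, the portable way to make sure a message survives an abrupt SIGTERM is to flush explicitly at the critical points. A minimal sketch (print's flush= keyword exists from Python 3.3; the helper name is ours):

```python
import sys

def log_error(msg):
    # Flush so the text reaches the file even if the process is killed
    # before a normal exit would have drained the buffers.
    print(msg, file=sys.stderr, flush=True)
    # Older spelling, which also works on Python 2:
    # sys.stderr.write(msg + "\n"); sys.stderr.flush()
```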
Re: How to build 64-bit Python on Solaris with GCC?
> ./configure CFLAGS=-m64 LDFLAGS=-m64 should work with a reasonably > recent revision. Thanks, that did, indeed work with CPython trunk. I eventually switched from gcc to Sun's compiler though because I was getting link warnings. Skip -- http://mail.python.org/mailman/listinfo/python-list
Obtaining user information
Currently I can get the currently-logged-in-userid via getpass.getuser() which would yield something like "tchase". Is there a cross-platform way to get the full username (such as from the GECOS field of /etc/passwd or via something like NetUserGetInfo on Win32) so I'd get "Tim Chase" instead? Thanks, -tkc -- http://mail.python.org/mailman/listinfo/python-list
Re: tracking variable value changes
On Thu, 08 Dec 2011 12:17:11 -0800, Catherine Moroney wrote:
> Hello,
>
> Is there a way to create a C-style pointer in (pure) Python so the
> following code will reflect the changes to the variable "a" in the
> dictionary "x"?

Strictly speaking, no, but there may be a way to get something close. See below.

> For example:
>
> >>> a = 1.0
> >>> b = 2.0
> >>> x = {"a":a, "b":b}
> >>> x
> {'a': 1.0, 'b': 2.0}
> >>> a = 100.0
> >>> x
> {'a': 1.0, 'b': 2.0}  ## at this point, I would like the value
>                       ## associated with the "a" key to be 100.0
>                       ## rather than 1.0

The line "a = 100" is a rebinding, and so what you are asking for isn't directly possible. But if you are willing to live with an explicit redirection, you can somewhat simulate a pointer with a list:

py> aPtr = [1.0]  # not really a pointer, but let's pretend it is
py> bPtr = [2.0]
py> x = {'a': aPtr, 'b': bPtr}
py> x
{'a': [1.0], 'b': [2.0]}
py> aPtr[0] = 100.0
py> x
{'a': [100.0], 'b': [2.0]}

If you prefer, you can write a simple class to handle the redirection with the interface of your choice. Something like this might be a good start:

class SharedValue:
    def set(self, value):
        self.value = value
    def get(self):
        return self.value
    def __repr__(self):
        # Somewhat dubious.
        return str(self.value)

py> a = SharedValue()  # create a pointer
py> a.set(1.0)
py> x = {'a': a}
py> x
{'a': 1.0}
py> a.set(100.0)
py> x
{'a': 100.0}

Look at the automatic delegation pattern for a way to have operations on "a" automatically apply to the object being pointed to. (This will be *much* simpler if you don't inherit from object.) But be warned, whatever you do, rebinding will behave in the standard Python way. E.g.:

py> aPtr = [2000.0]  # Oops, rebound the name to something else!
py> x  # and the connection is lost
{'a': [100.0], 'b': [2.0]}

> If I make "a" and "b" numpy arrays, then changes that I make to the
> values of a and b show up in the dictionary x.

Yes, because numpy arrays are mutable objects.
In this case, you have two (or more) references to a single object: the name "a", and the entry in dict "x". When you modify the object in either place, the change is visible in both places because they are the same object. But when you rebind the name "a" to another object -- not necessarily a *new* object, just a different one -- there is no way for the dict "x" to notice this change and follow along. > My understanding is that when I redefine the value of "a", that Python > is creating a brand-new float with the value of 100.0, whereas when I > use numpy arrays I am merely assigning a new value to the same object. Correct. Although the float need not be brand-new. Python could (but probably doesn't) re-use an existing float object. -- Steven -- http://mail.python.org/mailman/listinfo/python-list
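The "automatic delegation pattern" Steven mentions can be sketched with __getattr__. The Ref class below is illustrative, not from the post; note that on new-style classes special methods such as __add__ bypass __getattr__, which is what the aside about not inheriting from object is getting at.

```python
class Ref(object):
    """A mutable cell: rebind through .set() and every holder sees the change."""
    def __init__(self, value):
        self.value = value
    def set(self, value):
        self.value = value
    def get(self):
        return self.value
    def __repr__(self):
        return repr(self.value)
    def __getattr__(self, name):
        # Automatic delegation: anything Ref itself lacks is looked up
        # on the current target object (ordinary attributes/methods only).
        return getattr(self.value, name)

a = Ref(1.0)
x = {'a': a}   # the dict and the name share the same Ref object
a.set(100.0)   # now x['a'] shows 100.0 as well
```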
Re: I love the decorator in Python!!!
On Saturday, December 10, 2011 2:28:49 AM UTC+8, 8 Dihedral wrote: > On Thursday, December 8, 2011 7:43:12 PM UTC+8, Chris Angelico wrote: > > On Thu, Dec 8, 2011 at 10:22 PM, K.-Michael Aye wrote: > > > I am still perplexed about decorators though, am happily using Python for > > > many years without them, but maybe i am missing something? > > > For example in the above case, if I want the names attached to each other > > > with a comma, why wouldn't I just create a function doing exactly this? > > > Why > > > would I first write a single name generator and then decorate it so that I > > > never can get single names anymore (this is the case, isn't it? Once > > > decorated, I can not get the original behaviour of the function anymore. > > > > The example given is a toy. It's hardly useful. However, there are a > > number of handy uses for decorators; mostly, they consist of giving a > > single simple keyword to a complicated set of logic. One example is > > the @classmethod and @staticmethod decorators - the code to implement > > them could be uglier than nested inline assembly, but you don't have > > to care, because you just type "@staticmethod" in front of your def > > statement and it does its magic. > > > > Here's a handy trick that I'm sure someone has done in a more sophisticated > > way: > > > > def trace(func): > > if debugmode: > > return lambda *a,**ka: > > (print(">"+func.__name__),func(*a,**ka),print("<"+func.__name__))[1] > > return func > > > > Then you put @trace in front of all your functions, and if debugmode > > is False, nothing will be done - but set it to true, and you get > > console output at the entry and exit of each function. > > > > >>> @trace > > def test(x): > > print("Test! "+x) > > return 5 > > > > >>> test("asdf") > > >test > > Test! asdf > > > 5 > > > > Again, it's helpful because it condenses all the logic (including the > > 'debugmode' flag) down to a single high level directive: "Trace this > > function". 
> > > > ChrisA > > I did use decorators to turn functions into iterables to be traced. It is easy to use decorators in python to mimic those programs in Erlang. -- http://mail.python.org/mailman/listinfo/python-list
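For reference, ChrisA's trick reads a little more conventionally with an explicit wrapper plus functools.wraps, so the decorated function keeps its name; this is our restatement, under the same module-level debugmode assumption as the original:

```python
import functools

debugmode = True  # assumption: a module-level flag, as in the post above

def trace(func):
    """Print entry/exit lines around each call while debugmode is on;
    when it is off, return the function untouched (zero overhead)."""
    if not debugmode:
        return func
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(">" + func.__name__)
        try:
            return func(*args, **kwargs)
        finally:
            print("<" + func.__name__)
    return wrapper

@trace
def double(x):
    return x * 2
```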
Re: Python 2 or 3
On Sat, 10 Dec 2011 00:16:30 +0100, Enrico 'Henryx' Bianchi wrote: > Tobiah wrote: > > Use the newer version and don't look back. > Interesting reply, but if I have a platform which doesn't support Python > 3 (e.g. RHEL 5.x)? ]:) RHEL supports Python 3, it just doesn't provide Python 3. It almost certainly will work if you install from source. I haven't tried it on RHEL myself, but I have done so on Fedora and Centos, and there's no problem. But be warned that you should not replace the system python with Python 3. When installing, don't use "make install", as that will replace the system Python, instead use "make altinstall". Then the command "python" will still refer to the system Python (probably Python 2.4 or 2.5?), and "python3" should refer to Python 3.x. > Enrico > P.S. note that: I *don't* want to recompile Python in production > environment You shouldn't be learning programming on a production server :) -- Steven -- http://mail.python.org/mailman/listinfo/python-list
Re: Python 2 or 3
Tobiah wrote: > Use the newer version and don't look back. Interesting reply, but if I have a platform which doesn't support Python 3 (e.g. RHEL 5.x)? ]:) Enrico P.S. note that: I *don't* want to recompile Python in production environment -- http://mail.python.org/mailman/listinfo/python-list
Multiprocessing bug, is information ever omitted from a traceback?
Hi folks, A tangent off of this thread: http://groups.google.com/group/comp.lang.python/browse_frm/thread/751b7050c756c995# I'm programming in Python 2.6 on Ubuntu Linux 10.10, if it matters. I'm trying to track down a multiprocessing bug. Here's my traceback. All lines of code referenced in the traceback are in the standard library code: Exception in thread Thread-1: Traceback (most recent call last): File "/usr/lib/python2.6/threading.py", line 532, in __bootstrap_inner self.run() File "/usr/lib/python2.6/threading.py", line 484, in run self.__target(*self.__args, **self.__kwargs) File "/usr/lib/python2.6/multiprocessing/pool.py", line 284, in _handle_tasks put(task) TypeError: expected string or Unicode object, NoneType found Fortunately, I have a working version of my code. I was trying to add new features, and only my new code is causing trouble. This has allowed me to examine the contexts of task when everything works. Task is not a string when the program works. Task is not None when the program doesn't work. In fact, task is a deeply-nested tuple. NO PART of this tuple ever contains any strings, as far as I can tell. More details in my original thread. Now, of course I've seen that the standard traceback shows you the lines where various steps in a chain of function calls were taken. The traceback skips over any lines in the code between successive function calls, and assumes that you can follow along. No problem, I can do that. But when multiprocessing is involved, can this traceback be truncated in some way, for example when code execution switches over to a subprocess? I'm wondering if more code is getting executed after "put(task)" that I'm not seeing. Thanks for any information! -- http://mail.python.org/mailman/listinfo/python-list
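For what it's worth, the _handle_tasks feeder thread in multiprocessing.pool pickles each task before sending it down the pipe, so an exception raised at put(task) usually means something inside the task failed to pickle (the misleading message notwithstanding). A quick, hedged way to pre-check a nested task tuple yourself (the helper name is made up):

```python
import pickle

def assert_picklable(obj, path="task"):
    """Recursively verify that a (possibly nested) task pickles, reporting
    which element fails instead of an opaque error from the feeder thread."""
    if isinstance(obj, (tuple, list)):
        for i, item in enumerate(obj):
            assert_picklable(item, "%s[%d]" % (path, i))
    try:
        pickle.dumps(obj)
    except Exception as exc:
        raise TypeError("%s is not picklable: %r" % (path, exc))

task = (1, ("a", [2.0, None]))  # nested, but perfectly picklable
assert_picklable(task)          # passes silently
```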
Re: Contacts/Addressbook application - any good Python ones out there?
tinn...@isbd.co.uk wrote: > Alec Taylor wrote: > > Wammu? > > > I hadn't really considered gammu/wammu as I saw it as a mobile phone > synchronisation tool, but I've looked a bit harder and it might very > well be what I need - thank you! > Well one problem with wammu is that you can't do anything with the program unless there's a phone connected. -- Chris Green -- http://mail.python.org/mailman/listinfo/python-list
Re: Dynamic variable creation from string
Massi wrote: Thank you all for your replies, first of all my Sum function was an example simplifying what I have to do in my real function. In general the D dictionary is complex, with a lot of keys, so I was searching for a quick method to access all the variables in it without doing the explicit creation: a, b, c = D['a'], D['b'], D['c'] and without using directly the D dictionary (boring...). When I talked about nested function I meant both cases Chris, but this is not really a tight constraint. I tried to follow the hints of Chris together with some help by google and used the following code: for k in D : exec "%s = D[k]" %k That seems to do the trick, but someone speaks about "dirty code", can anyone point out which problems this can generate? Again, thank you for your help! Besides the serious security issues, this method won't make the problem any better in Python 3. ~Ethan~ -- http://mail.python.org/mailman/listinfo/python-list
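For what it's worth, a safer alternative to the exec loop is to pull the values out explicitly with operator.itemgetter, or to wrap the dict in a small attribute-namespace object (the NS class below is illustrative, not standard):

```python
from operator import itemgetter

D = {'a': 1, 'b': 2, 'c': 3, 'x': 99}  # stand-in for the real dictionary

# One explicit, greppable line instead of exec:
a, b, c = itemgetter('a', 'b', 'c')(D)

class NS(object):
    """Attribute access over a dict: ns.a instead of D['a']."""
    def __init__(self, d):
        self.__dict__.update(d)

ns = NS(D)  # ns.a, ns.b, ns.x ... without creating real local variables
```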
Buffering of sys.stdout and sys.stderr in python3 (and documentation)
Hi all, Short version: I'm a bit confused in general as to the changes between python2 and python3 regarding how standard output and standard error do buffering. A few things seem to have changed and I've failed to find any documentation of how and why. Also, the meaning of "python -u" seems to have changed and the docs don't seem to reflect the new behaviour (and I can't find any docs about the change either)... Long version: From rude experiment it seems that: 1) In Python 2.x, standard error was always unbuffered while standard output was buffered by default. In python3, both are buffered. In both cases, "buffered" means line-buffered when writing to the console and not line-buffered when redirected to files. 2) In Python 2.x, the "-u" flag meant everything was totally unbuffered. In Python 3.x, it means that both stdout and stderr are line-buffered also when redirected to files. Are either of these changes documented anywhere? (1) seems important: it can lead to not seeing exception printouts, if stderr is redirected to a file and the program is subsequently terminated with SIGTERM. I just wasted quite a bit of time due to this situation... This is what the Python 3 docs have to say about the -u flag: "Force the binary layer of the stdin, stdout and stderr streams (which is available as their buffer attribute) to be unbuffered. The text I/O layer will still be line-buffered." The "still" seems misleading to me, as it is only relevant if writing to the console. It would be useful to contrast the behaviour with and without "-u" when writing to files I would say. Regards, Geoff Bache -- http://mail.python.org/mailman/listinfo/python-list
Re: I love the decorator in Python!!!
On Thursday, December 8, 2011 7:43:12 PM UTC+8, Chris Angelico wrote: > On Thu, Dec 8, 2011 at 10:22 PM, K.-Michael Aye wrote: > > I am still perplexed about decorators though, am happily using Python for > > many years without them, but maybe i am missing something? > > For example in the above case, if I want the names attached to each other > > with a comma, why wouldn't I just create a function doing exactly this? Why > > would I first write a single name generator and then decorate it so that I > > never can get single names anymore (this is the case, isn't it? Once > > decorated, I can not get the original behaviour of the function anymore. > > The example given is a toy. It's hardly useful. However, there are a > number of handy uses for decorators; mostly, they consist of giving a > single simple keyword to a complicated set of logic. One example is > the @classmethod and @staticmethod decorators - the code to implement > them could be uglier than nested inline assembly, but you don't have > to care, because you just type "@staticmethod" in front of your def > statement and it does its magic. > > Here's a handy trick that I'm sure someone has done in a more sophisticated > way: > > def trace(func): > if debugmode: > return lambda *a,**ka: > (print(">"+func.__name__),func(*a,**ka),print("<"+func.__name__))[1] > return func > > Then you put @trace in front of all your functions, and if debugmode > is False, nothing will be done - but set it to true, and you get > console output at the entry and exit of each function. > > >>> @trace > def test(x): > print("Test! "+x) > return 5 > > >>> test("asdf") > >test > Test! asdf > 5 > > Again, it's helpful because it condenses all the logic (including the > 'debugmode' flag) down to a single high level directive: "Trace this > function". > > ChrisA I did use decorators to turn functions into iterables to be traced. -- http://mail.python.org/mailman/listinfo/python-list
Re: Misleading error message of the day
Jean-Michel Pichavant wrote:
> Ethan Furman wrote:
>> Jean-Michel Pichavant wrote:
>>> You have the opportunity to not use unpacking anymore :o) There is a
>>> recent thread where the dark side of unpacking was exposed. Unpacking
>>> is a cool feature for very small applications but should be avoided
>>> whenever possible otherwise.
>>
>> Which thread was that?
>>
>> ~Ethan~
>
> "A tuple in order to pass returned values ?" was the thread.

Thanks. ~Ethan~ -- http://mail.python.org/mailman/listinfo/python-list
Re: Execute python within Oracle
2011/12/9 André Lopes : > Hi all, > > > > I wrote a simple Java program to be called within an Oracle database. The > goal is to execute a Python program within the DB itself, by the means of a > Java program. The problem is that when I execute the procedure inside the > DB, nothing happens… > > > > If I create the same Java class outside the DB and execute it, the python > program works perfectly, only inside the DB nothing happens. The program is > the following. Have you granted the necessary permissions to execute programs from Java? http://asktom.oracle.com/pls/asktom/f?p=100:11:0P11_QUESTION_ID:952229840241 Note that article is for Oracle 8, I'm not sure whether the permissions might have changed since then. Cheers, Ian -- http://mail.python.org/mailman/listinfo/python-list
Re: order independent hash?
On Sat, Dec 10, 2011 at 3:51 AM, Hrvoje Niksic wrote: > The case of dicts which require frequent access, such as those used to > implement namespaces, is different, and more interesting. Those dicts > are typically quite small, and for them the difference between O(log n) > and O(1) is negligible in both theory (since n is "small", i.e. bounded) > and practice. In fact, depending on the details of the implementation, > the lookup in a small tree could even be marginally faster. This is something where, I am sure, far greater minds than mine delve... but, would a splay tree be effective for name lookups? In most cases, you'll have a huge puddle of names of which you use the tiniest fraction; and a splay tree would, in effect, automatically optimize itself to handle tight loops. ChrisA -- http://mail.python.org/mailman/listinfo/python-list
Execute python within Oracle
Hi all, I wrote a simple Java program to be called within an Oracle database. The goal is to execute a Python program within the DB itself, by the means of a Java program. The problem is that when I execute the procedure inside the DB, nothing happens… If I create the same Java class outside the DB and execute it, the python program works perfectly, only inside the DB nothing happens. The program is the following.

CREATE OR REPLACE AND COMPILE java source named "OSCommand" as
import java.io.*;
public class OSCommand{
    public static void Run(){
        try {
            Runtime r = Runtime.getRuntime();
            Process p = r.exec("cmd /c C:\\Python32\\python.exe C:\\Ficheiros\\SAP\\Novos\\xls2csv.py C:\\Ficheiros\\SAP\\Novos\\20111020_ListagemSAP.xlsx");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
/

Can anyone help? Thanks, André -- http://mail.python.org/mailman/listinfo/python-list
Re: order independent hash?
Steven D'Aprano writes: > Except for people who needed dicts with tens of millions of items. Huge tree-based dicts would be somewhat slower than today's hash-based dicts, but they would be far from unusable. Trees are often used to organize large datasets for quick access. The case of dicts which require frequent access, such as those used to implement namespaces, is different, and more interesting. Those dicts are typically quite small, and for them the difference between O(log n) and O(1) is negligible in both theory (since n is "small", i.e. bounded) and practice. In fact, depending on the details of the implementation, the lookup in a small tree could even be marginally faster. -- http://mail.python.org/mailman/listinfo/python-list
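The comparison being made can be sketched concretely with the bisect module: parallel sorted arrays give the O(log n) lookup a balanced tree would, and for namespace-sized n it is indeed in the same ballpark as a dict. A hedged illustration, not a benchmark:

```python
from bisect import bisect_left

def tree_style_lookup(sorted_keys, values, key):
    """O(log n) lookup over parallel sorted arrays, standing in for the
    balanced-tree namespace discussed above."""
    i = bisect_left(sorted_keys, key)
    if i < len(sorted_keys) and sorted_keys[i] == key:
        return values[i]
    raise KeyError(key)

# A "namespace" of typical size: a handful of names.
names = sorted(['self', 'x', 'y', 'result'])
vals = [n.upper() for n in names]
d = dict(zip(names, vals))  # the O(1) hash-based equivalent

assert tree_style_lookup(names, vals, 'x') == d['x']
```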
Re: Contacts/Addressbook application - any good Python ones out there?
Alec Taylor wrote: > Wammu? > I hadn't really considered gammu/wammu as I saw it as a mobile phone synchronisation tool, but I've looked a bit harder and it might very well be what I need - thank you! > On Sat, Dec 10, 2011 at 1:41 AM, wrote: > > I'm after an application for managing Contacts (i.e. an Address Book) > > and as I suspect I will want to 'tune' it a bit Python would be my > > preferred language. > > > > So far I have found :- > > > > pycocuma - reasonable but rather old and a bit clunky (uses TCL/Tk) > > pyaddressbook - newer but very minimal > > > > Does anyone have any other suggestions? I'd prefer an application > > which uses vCards as its native data storage format but that's not > > vital and I'd also like to have a GUI but again that's not vital, a > > well designed curses/terminal application would be OK too. > > > > -- > > Chris Green > > -- > > http://mail.python.org/mailman/listinfo/python-list -- Chris Green -- http://mail.python.org/mailman/listinfo/python-list
Re: How to build 64-bit Python on Solaris with GCC?
Skip Montanaro wrote: > Thanks. I have several different versions in my local sandbox. None > are 64-bit ELFs. Just to make sure I hadn't missed some new development > in this area, I cloned the hg repository and built the trunk version > from scratch. I get a 32-bit executable on Solaris: > >% file ./python >./python: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), >dynamically linked (uses shared libs), not stripped ./configure CFLAGS=-m64 LDFLAGS=-m64 should work with a reasonably recent revision. Stefan Krah -- http://mail.python.org/mailman/listinfo/python-list
Re: [OT] Book authoring
Nick Dokos wrote: > There is also orgmode, which has been used for a few books > (http://orgmode.org ). I know it does HTML and PDF (the latter through > latex), but I'm not sure about ePub: ISTR somebody actually did ePub for > his book but I don't remember details. Avdi Grimm produced his book "Exceptional Ruby" (http://exceptionalruby.com ) this way, including ePub formats (I hope mentioning Ruby in this context is not a punishable offense...) Apparently, there is calibre (http://calibre-ebook.com/ ) that will take you from HTML to ePub. See this orgmode list article e.g. http://thread.gmane.org/gmane.emacs.orgmode/41826 Nick -- http://mail.python.org/mailman/listinfo/python-list
Re: [OT] Book authoring
On 12/09/2011 03:25 AM, Miki Tebeka wrote: Greetings, Any recommendations for a book authoring system that supports the following: 1. Code examples (with syntax highlighting and line numbers) 2. Output HTML, PDF, ePub ... 3. Automatic TOC and index 4. Search (in HTML) - this is a "nice to have" Can I somehow use Sphinx? Thanks, -- Miki I think it depends on what you want exactly. If it's a nice book with a scientific look and many complicated tables/figures then I think that LaTeX is the way to go (maybe even org-mode but it's mainly for emacs-fans). The problem with LaTeX is that it's quite tricky to export to other formats, harder to learn and not as flexible as a Python-based solution like Sphinx. I would suggest trying Sphinx and seeing if you're missing something. -- http://mail.python.org/mailman/listinfo/python-list
Re: How to build 64-bit Python on Solaris with GCC?
Karim gmail.com> writes: > ./configure > make > make install Thanks. I have several different versions in my local sandbox. None are 64-bit ELFs. Just to make sure I hadn't missed some new development in this area, I cloned the hg repository and built the trunk version from scratch. I get a 32-bit executable on Solaris: % file ./python ./python: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), dynamically linked (uses shared libs), not stripped Skip -- http://mail.python.org/mailman/listinfo/python-list
Re: [OT] Book authoring
Grant Edwards wrote: > On 2011-12-09, Miki Tebeka wrote: > > Greetings, > > > > Any recommendations for a book authoring system that supports the following: > > 1. Code examples (with syntax highlighting and line numbers) > > 2. Output HTML, PDF, ePub ... > > 3. Automatic TOC and index > > 4. Search (in HTML) - this is a "nice to have" > > http://en.wikipedia.org/wiki/Lightweight_markup_language > > I've used asciidoc extensively and reStructuredText a little. Asciidoc > will produce all the formats you mentioned (though I've only regularly > used HTML and PDF). reStructuredText is what's used for Python docs > isn't it? > > > Can I somehow use Sphinx? > > Don't know what Sphinx is. > I think Sphinx is used for the python docs: it sits atop rST and does all the transformations/processing to produce the desired output ( http://sphinx.pocoo.org ) > And there's always the old stand-by LaTeX, but it's a bit more > heavyweight with more of a learning curve. OTOH, it does produce > text-book quality output. > There is also orgmode, which has been used for a few books (http://orgmode.org ). I know it does HTML and PDF (the latter through latex), but I'm not sure about ePub: ISTR somebody actually did ePub for his book but I don't remember details. The indexing is manual: add #+index: foo entries as required. But in general, imo, automatic indexing for books sucks raw eggs (it works much better for highly regular source code like the python source base). Nick -- http://mail.python.org/mailman/listinfo/python-list
Re: [OT] Book authoring
On 2011-12-09, Miki Tebeka wrote: > Greetings, > > Any recommendations for a book authoring system that supports the following: > 1. Code examples (with syntax highlighting and line numbers) > 2. Output HTML, PDF, ePub ... > 3. Automatic TOC and index > 4. Search (in HTML) - this is a "nice to have" http://en.wikipedia.org/wiki/Lightweight_markup_language I've used asciidoc extensively and reStructuredText a little. Asciidoc will produce all the formats you mentioned (though I've only regularly used HTML and PDF). reStructuredText is what's used for Python docs isn't it? > Can I somehow use Sphinx? Don't know what Sphinx is. And there's always the old stand-by LaTeX, but it's a bit more heavyweight with more of a learning curve. OTOH, it does produce text-book quality output. -- Grant Edwards grant.b.edwards at gmail.com Yow! BELA LUGOSI is my co-pilot ... -- http://mail.python.org/mailman/listinfo/python-list
Re: Contacts/Addressbook application - any good Python ones out there?
Wammu? On Sat, Dec 10, 2011 at 1:41 AM, wrote: > I'm after an application for managing Contacts (i.e. an Address Book) > and as I suspect I will want to 'tune' it a bit Python would be my > preferred language. > > So far I have found :- > > pycocuma - reasonable but rather old and a bit clunky (uses TCL/Tk) > pyaddressbook - newer but very minimal > > Does anyone have any other suggestions? I'd prefer an application > which uses vCards as its native data storage format but that's not > vital and I'd also like to have a GUI but again that's not vital, a > well designed curses/terminal application would be OK too. > > -- > Chris Green > -- > http://mail.python.org/mailman/listinfo/python-list -- http://mail.python.org/mailman/listinfo/python-list
Contacts/Addressbook application - any good Python ones out there?
I'm after an application for managing Contacts (i.e. an Address Book) and as I suspect I will want to 'tune' it a bit Python would be my preferred language. So far I have found :- pycocuma - reasonable but rather old and a bit clunky (uses TCL/Tk) pyaddressbook - newer but very minimal Does anyone have any other suggestions? I'd prefer an application which uses vCards as its native data storage format but that's not vital and I'd also like to have a GUI but again that's not vital, a well designed curses/terminal application would be OK too. -- Chris Green -- http://mail.python.org/mailman/listinfo/python-list
Re: Misleading error message of the day
In article , Jean-Michel Pichavant wrote: > Ethan Furman wrote: > > Jean-Michel Pichavant wrote: > >> You have the opportunity to not use unpacking anymore :o) There is a > >> recent thread where the dark side of unpacking was exposed. Unpacking > >> is a cool feature for very small applications but should be avoided > >> whenever possible otherwise. > > > > Which thread was that? > > > > > > ~Ethan~ > "A tuple in order to pass returned values ?" was the thread. > > JM To save everybody the effort of finding it, I think he's talking about https://groups.google.com/d/topic/comp.lang.python/2vcwYfIQSOM/discussion -- http://mail.python.org/mailman/listinfo/python-list
Re: tracking variable value changes
On 12/08/2011 08:17 PM, Catherine Moroney wrote: Hello, Is there a way to create a C-style pointer in (pure) Python so the following code will reflect the changes to the variable "a" in the dictionary "x"? For example: >>> a = 1.0 >>> b = 2.0 >>> x = {"a":a, "b":b} >>> x {'a': 1.0, 'b': 2.0} >>> a = 100.0 >>> x {'a': 1.0, 'b': 2.0} ## at this point, I would like the value ## associated with the "a" key to be 100.0 ## rather than 1.0 If I make "a" and "b" numpy arrays, then changes that I make to the values of a and b show up in the dictionary x. My understanding is that when I redefine the value of "a", that Python is creating a brand-new float with the value of 100.0, whereas when I use numpy arrays I am merely assigning a new value to the same object. Is there some way to rewrite the code above so the change of "a" from 1.0 to 100.0 is reflected in the dictionary. I would like to use simple datatypes such as floats, rather than numpy arrays or classes. I tried using weakref's, but got the error that a weak reference cannot be created to a float. Catherine Not sure if it's exactly pure python but Traits can actually do this https://github.com/enthought/traits -- http://mail.python.org/mailman/listinfo/python-list
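The mutable-container idea works with any shared object, not just numpy arrays or Traits; a minimal sketch (the Box class here is hypothetical, standing in for a C-style pointer):

```python
# Keep the value inside a mutable container, so the dict and the
# "variable" refer to one shared object. Mutating the object is seen
# through both names; rebinding the name would not be.
class Box:
    def __init__(self, value):
        self.value = value
    def __repr__(self):
        return repr(self.value)

a = Box(1.0)
b = Box(2.0)
x = {"a": a, "b": b}

a.value = 100.0   # mutate the shared object instead of rebinding the name
print(x)          # -> {'a': 100.0, 'b': 2.0}
```

The same effect is what makes numpy arrays appear to "update" in the dict: in-place mutation versus rebinding.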
Re: I love the decorator in Python!!!
On 8 Dic, 12:22, K.-Michael Aye wrote: > On 2011-12-08 08:59:26 +, Thomas Rachel said: > > > > > Am 08.12.2011 08:18 schrieb 8 Dihedral: > >> I use the @ decorator to behave exactly like a c macro that > >> does have fewer side effects. > > >> I am wondering is there other interesting methods to do the > >> jobs in Python? > > > In combination with a generator, you can do many funny things. > > > For example, you can build up a string: > > > def mkstring(f): > > """Turns a string generator into a string, > > joining with ", ". > > """ > > return ", ".join(f()) > > > def create_answer(): > > @mkstring > > def people(): > > yield "Anna" > > yield "John" > > yield "Theo" > > > return "The following people were here: " + people > > > Many other things are thinkable... > > > Thomas > > I am still perplexed about decorators though, am happily using Python > for many years without them, but maybe i am missing something? > For example in the above case, if I want the names attached to each > other with a comma, why wouldn't I just create a function doing exactly > this? Why would I first write a single name generator and then decorate > it so that I never can get single names anymore (this is the case, > isn't it? Once decorated, I can not get the original behaviour of the > function anymore.) > So, above, why not > def mkstring(mylist): > with the same function declaration and then just call it with a list of > names that I generate elsewhere in my program? > I just can't identify the use-case for decorators, but as I said, maybe > I am missing something. > > Michael I had/have similar feelings. For instance, this is something that I thought useful, but then I never used in real code.
The idea was to find a way to automate this code pattern, which I do a lot:

class SomeClass:
    def __init__(self, some, attribute, here):
        self.some, self.attribute, self.here = some, attribute, here

In other words, I often define classes in which the constructor list of arguments corresponds one-to-one to class attributes. So I thought of this (it uses class decorators so it only works with Python 3.x):

class FieldsDecorator:
    def __init__(self, *names):
        self.names = names
    def __call__(self, cls):
        def constructor(instance, **kwds):
            for n, v in kwds.items():
                if n in self.names:
                    setattr(instance, n, v)
                else:
                    raise TypeError("%s is not a valid field" % n)
        setattr(cls, '__init__', constructor)
        return cls

@FieldsDecorator("uno", "due")
class Prova:
    pass

p = Prova(uno=12, due=9)
print(p.uno, p.due)

It works and it is nice, but I don't find it compelling enough to use it. I keep assigning the attributes directly, which is more readable. Decorators are really useful when you have a lot of repetitive boilerplate code that you _want_ to hide, since it has little to do with the problem logic and more to do with the technicalities of the programming language or of some framework that you are using. It is called "separation of concerns", I think, and is one of the principles of Aspect-Oriented Programming (and with decorators you can do some nice AOP exercises ...). Ciao --- FB -- http://mail.python.org/mailman/listinfo/python-list
Re: Dynamic variable creation from string
On Fri, Dec 9, 2011 at 10:59 PM, Steven D'Aprano wrote: > (4) If you think you can make exec safe with a prohibited list of > dangerous strings, you probably can't. If you think that it's even _possible_ to make exec safe with a blacklist, I have a nice padded cell for you over here. Security is NEVER achieved with blacklists, ONLY whitelists. ChrisA -- http://mail.python.org/mailman/listinfo/python-list
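The whitelist principle can be sketched in a few lines (ALLOWED and bind_whitelisted are hypothetical names, purely for illustration):

```python
# Only names you explicitly allow are ever bound; everything else is
# rejected outright. Compare with a blacklist, which has to anticipate
# every possible hostile input and inevitably misses one.
ALLOWED = {"a", "b", "c"}

def bind_whitelisted(d, namespace):
    for key, value in d.items():
        if key not in ALLOWED:
            raise ValueError("refusing to bind %r" % (key,))
        namespace[key] = value

ns = {}
bind_whitelisted({"a": 1.0, "b": 2.0}, ns)
print(ns)   # -> {'a': 1.0, 'b': 2.0}
```

Note that this sketch binds into an ordinary dict rather than via exec, which sidesteps code injection entirely.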
Re: Dynamic variable creation from string
On Fri, 09 Dec 2011 11:59:16 +, Steven D'Aprano wrote: > Just the second-most common source of viruses, malware and security > vulnerabilities (behind buffer overflows): code injection attacks. Oops, I forgot to go back and revise this sentence. Code injection attacks are now the most common, not second-most common, source of security vulnerabilities. http://cwe.mitre.org/top25/index.html -- Steven -- http://mail.python.org/mailman/listinfo/python-list
Re: Dynamic variable creation from string
On Fri, 09 Dec 2011 01:55:28 -0800, Massi wrote: > for k in D : exec "%s = D[k]" %k > > That seems to do the trick, but someone speaks about "dirty code", can > anyone point me out which problems this can generate? Again, thank you > for your help! Just the second-most common source of viruses, malware and security vulnerabilities (behind buffer overflows): code injection attacks. Code injection attacks make up at least three of the top 25 security vulnerabilities on the CWE/SANS list: http://cwe.mitre.org/top25/index.html including the top 2 most dangerous threats (beating even our old friend, the buffer overflow): SQL injection and OS command injection. Your use of exec is vulnerable to attack if a hostile user can fool you into using a dict like this one:

D = {'a': '42',
     'import os;'
     ' os.system("""echo "ha ha i ownz ur system rm-rf/" """); b': '23',
    }

for k in D : exec "%s = D[k]" % k

You might think you're safe from such attacks, but (1) it is MUCH harder to protect against them than you might think; and (2) code has a habit of being re-used. Today your application might only be used by you; next week your code might find itself embedded in a web-application where hostile script kiddies can destroy your server with a single upload. My advice is:

(1) If you need to ask why exec is dangerous, you shouldn't touch it.
(2) If you're sure you can protect against code injection, you can't.
(3) If you think you need exec, you probably don't.
(4) If you think you can make exec safe with a prohibited list of dangerous strings, you probably can't.

-- Steven -- http://mail.python.org/mailman/listinfo/python-list
Re: order independent hash?
On Thu, 08 Dec 2011 10:30:01 +0100, Hrvoje Niksic wrote: > In a language like Python, the difference between O(1) and O(log n) is > not the primary reason why programmers use dict; they use it because > it's built-in, efficient compared to alternatives, and convenient to > use. If Python dict had been originally implemented as a tree, I'm sure > it would be just as popular. Except for people who needed dicts with tens of millions of items. Remember also that dicts are used for looking up names in Python. Nearly all method calls, attribute accesses, global name lookups, function calls, etc. go through at least one and potentially multiple dict lookups. The simple statement: n = len(x.y) + len(z) likely requires nine dict lookups, and potentially more. In even a small application, there could be tens of millions of dict lookups; changing each of them from O(1) to O(log N) could result in a measurable slowdown to Python code in real applications. That is why dicts are highly optimized for speed. As fast as dicts are, sometimes they aren't fast enough. One common micro-optimization for tight loops and time-critical code is to create local variables from globals or builtins, because local variable access bypasses dict lookup. So people would notice if dicts were slower. -- Steven -- http://mail.python.org/mailman/listinfo/python-list
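The local-variable micro-optimization described above looks like this in practice (a sketch; actual speedups are machine-dependent, this only shows the idiom):

```python
# Binding a builtin to a local name once avoids repeated global/builtin
# dict lookups inside a tight loop; local names use fast array access.
def total_length(items):
    total = 0
    for item in items:
        total += len(item)        # looks up "len" in globals/builtins each pass
    return total

def total_length_fast(items, _len=len):   # len bound to a local at def time
    total = 0
    for item in items:
        total += _len(item)       # local lookup, no dict access
    return total

data = [[0] * 3] * 1000
print(total_length(data), total_length_fast(data))  # -> 3000 3000
```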
Re: Dynamic variable creation from string
Massi wrote: > for k in D : exec "%s = D[k]" %k > > That seems to do the trick, but someone speaks about "dirty code", can > anyone point me out which problems this can generate? exec can run arbitrary code, so everybody reading the above has to go back to the definition of D to verify that it can only contain "safe" keys. Filling D with user-input is right out because a malicious user could do anything he likes. Here's a harmless demo that creates a file: >>> d = {"x = 42\nwith open('tmp.txt', 'w') as f:\n f.write('whatever')\nx": 123} >>> for k in d: exec "%s = d[k]" % k ... >>> x 123 >>> open("tmp.txt").read() 'whatever' -- http://mail.python.org/mailman/listinfo/python-list
Re: Dynamic variable creation from string
Thank you all for your replies. First of all, my Sum function was an example simplifying what I have to do in my real function. In general the D dictionary is complex, with a lot of keys, so I was searching for a quick method to access all the variables in it without doing the explicit creation: a, b, c = D['a'], D['b'], D['c'] and without using the D dictionary directly (boring...). When I talked about nested functions I meant both cases, Chris, but this is not a really tight constraint. I tried to follow the hints of Chris together with some help from google and used the following code: for k in D : exec "%s = D[k]" % k That seems to do the trick, but someone speaks about "dirty code"; can anyone point out which problems this can generate? Again, thank you for your help! -- http://mail.python.org/mailman/listinfo/python-list
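For what it's worth, an exec-free alternative to the loop above is to wrap the dict in an attribute-style namespace, so D['a'] becomes ns.a without creating real variables (a hedged sketch; types.SimpleNamespace needs Python 3.3+, later than the Python 2 used in this thread):

```python
# Attribute access on a namespace object reads almost like bare
# variables, with none of exec's code-injection risk.
from types import SimpleNamespace

D = {"a": 1, "b": 2, "c": 3}
ns = SimpleNamespace(**D)
print(ns.a + ns.b + ns.c)   # -> 6
```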
Re: I love the decorator in Python!!!
Am 08.12.2011 12:43 schrieb Chris Angelico: On Thu, Dec 8, 2011 at 10:22 PM, K.-Michael Aye wrote: I am still perplexed about decorators though, am happily using Python for many years without them, but maybe i am missing something? For example in the above case, if I want the names attached to each other with a comma, why wouldn't I just create a function doing exactly this? Why would I first write a single name generator and then decorate it so that I never can get single names anymore (this is the case, isn't it? Once decorated, I can not get the original behaviour of the function anymore.) The example given is a toy. It's hardly useful. Right. It was supposed to be an example. In my case, I work with a script used to build an XML file. I change this script from time to time in order to match the requirements. Here I find it useful just to add some more yield statements for adding entries. But now that I think again about it, it's more an example for generators, not so much for decorators - which I like as well.

But some useful examples for decorators include:

1. "Threadifying" a function, i.e. turning it into a Thread object, or into a callable which in turn starts a thread according to the given parameters.

2. Automatically calling a function if the given module is executed as a script, a kind of replacement for the "if __name__ == '__main__':" stuff.

3. Meta decorators: I find it annoying to have to wrap the function given to the decorator into another one, modifying its properties and returning that in turn.

def wrapfunction(decorated):
    """Wrap a function taking (f, *a, **k) and replace it with a
    function taking (f) and returning a function taking (*a, **k)
    which calls our decorated function.
    """
    from functools import wraps
    @wraps(decorated)
    def wrapped_outer(f):
        @wraps(f)
        def wrapped_inner(*a, **k):
            return decorated(f, *a, **k)
        return wrapped_inner
    return wrapped_outer

makes it much easier to create decorators which just wrap a function into another, extending its functionality:

@wrapfunction
def return_list(f, *a, **k):
    return list(f(*a, **k))

is much easier and IMHO much better to read than

def return_list(f):
    """Wrap a function taking (f, *a, **k) and replace it with a
    function taking (f) and returning a function taking (*a, **k)
    which calls our decorated function.
    """
    from functools import wraps
    @wraps(f)
    def wrapped(*a, **k):
        return list(f(*a, **k))
    return wrapped

- especially if used multiple times.

3a. This is a modified case of my first example: If you want a function to assemble and return a list instead of a generator object, but prefer "yield" over "ret=[]; ret.append();...", you can do that with this @return_list.

4. So-called "indirect decorators":

@spam(eggs)
def foo(bar): pass

are as well quite tricky to build when taking

def indirdeco(ind):
    from functools import update_wrapper, wraps
    upd = wraps(ind)
    # outer wrapper: replaces a call with *a, **k with an updated
    # lambda, getting the function to be wrapped and applying it and
    # *a, **k to ind.
    outerwrapper = lambda *a, **k: upd(lambda f: ind(f, *a, **k))
    # We update this as well:
    return upd(outerwrapper)
    # We don't update f nor the result of ind() - it is the callee's
    # business.

It is kind of reverse to 3.

@indirdeco
def addingdeco(f, offset):
    return lambda *a, **k: f(*a, **k) + offset
    # Here should maybe be wrapped - it is just supposed to be an
    # example.

5. Creating an __all__ for a module. Instead of maintaining it somewhere centrally, you can take a

class AllList(list):
    """list which can be called in order to be used as a
    __all__-adding decorator"""
    def __call__(self, obj):
        """for decorators"""
        self.append(obj.__name__)
        return obj

, do a

__all__ = AllList()

and subsequently decorate each function with @__all__

6. Re-use a generator: A generator object is created upon calling the generator function with parameters and can be used only once. An object wrapping this generator might be useful.

# Turn a generator into an iterable object calling the generator.
class GeneratorIterable(object):
    """Take a parameterless generator function and call it on every
    iteration."""
    def __init__(self, gen):
        # Set object attribute.
        self.gen = gen
    def __iter__(self):
        # Class attribute calls object attribute in order to keep
        # namespace variety small.
        return self.gen()

@GeneratorIterable
def mygen():
    yield 1
    yield 2

list(mygen) -> [1, 2]
list(mygen) -> [1, 2] # again, without the ()

Might be useful if the object is to be transferred to somewhere else.

Some of these decorators are more useful, some less if seen standalone, but very handy if creating other decorators.

HTH nevertheless, Thomas -- http://mail.python.org/mai
Re: I love the decorator in Python!!!
On 12/9/11 5:02 AM, alex23 wrote: On Dec 9, 2:38 am, Chris Angelico wrote: One piece of sophistication that I would rather like to see, but don't know how to do. Instead of *args,**kwargs, is it possible to somehow copy in the function's actual signature? I was testing this out in IDLE, and the fly help for the function no longer gave useful info about its argument list. The 3rd party 'decorator' module takes care of issues like docstrings & function signatures. I'd really like to see some of that functionality in the stdlib though. Much of it is: http://docs.python.org/library/functools#functools.update_wrapper -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco -- http://mail.python.org/mailman/listinfo/python-list
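A minimal sketch of what update_wrapper (via functools.wraps) preserves: without it, the decorated function would report the wrapper's name and docstring instead of its own.

```python
# functools.wraps copies __name__, __doc__, __module__ etc. from the
# wrapped function onto the wrapper, and (in Python 3) sets __wrapped__
# so inspect.signature can recover the original argument list.
import functools

def logged(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print("calling", func.__name__)
        return func(*args, **kwargs)
    return wrapper

@logged
def add(a, b):
    "Return a + b."
    return a + b

print(add.__name__, "-", add.__doc__)   # -> add - Return a + b.
```

The third-party 'decorator' module goes a step further and rebuilds the wrapper with the original signature, which is the part IDLE's fly help cares about.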
Re: Misleading error message of the day
Ethan Furman wrote: Jean-Michel Pichavant wrote: You have the opportunity to not use unpacking anymore :o) There is a recent thread where the dark side of unpacking was exposed. Unpacking is a cool feature for very small applications but should be avoided whenever possible otherwise. Which thread was that? ~Ethan~ "A tuple in order to pass returned values ?" was the thread. JM -- http://mail.python.org/mailman/listinfo/python-list
Re: subprocess.Popen under windows 7
Thank you very much. Now I have written a little C++ program which produces some output. And now it works fine. There is something wrong with 7zip.exe and the arglist with *. Tonight I will go on and hunt the error. It should be Python 2.7

#!/usr/bin/env python
PATH_TO_EXE = "C:/Users/yoicks/Desktop/ausgabe.exe"

import os, shutil, sys, subprocess, traceback

try:
    # win32
    from msvcrt import getch
except ImportError:
    # unix
    def getch():
        import sys, tty, termios
        fd = sys.stdin.fileno()
        old = termios.tcgetattr(fd)
        try:
            tty.setraw(fd)
            return sys.stdin.read(1)
        finally:
            termios.tcsetattr(fd, termios.TCSADRAIN, old)

def press_any_key():
    print "Press any key to continue."
    getch()

def exit_with_string(exit_string):
    print exit_string
    press_any_key()
    sys.exit(exit_string)

def start_exe(PATH_TO_EXE):
    try:
        arglist = [PATH_TO_EXE]
        print ("try running cmd:\n %s\n" % (' '.join(arglist)))
        sp = subprocess.Popen(args=arglist, stdout=subprocess.PIPE,
                              stderr=subprocess.PIPE, shell=True)
    except:
        print "Error while running subprocess.\n"
        print "Traceback:\n%s" % traceback.format_exc()
        return False
    output, error = sp.communicate()
    if output:
        print output
    if error:
        print error
        return False
    return True

return_value = start_exe(PATH_TO_EXE)
if return_value:
    print("Backup successfully written.")
else:
    print("FAILURE during the backup")
press_any_key()

-- http://mail.python.org/mailman/listinfo/python-list
Re: subprocess.Popen under windows 7
On 09/12/2011 08:32, Ulrich Eckhardt wrote: Am 08.12.2011 23:41, schrieb Frank van den Boom: arglist = [PATH_TO_7ZIP,"a", "-sfx", archive_name, "*", "-r", "-p",PASSWORD] The "*" is resolved by the shell, this is not a wildcard that gets passed to the program. At least not normally, your case might be different. "... not normally" == "... not on Unix". On Windows, the shell doesn't do any wildcard expansion. The OP is asking about behaviour on Windows 7. TJG -- http://mail.python.org/mailman/listinfo/python-list
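Since cmd.exe does no wildcard expansion, a portable script can expand the pattern itself before building the argument list; a hedged sketch using glob (expand_args is a hypothetical helper, not part of the thread's script — and note that 7z itself also understands "*" when no shell is involved):

```python
# Expand any wildcard arguments in Python, so the behaviour no longer
# depends on whether the platform's shell expands them.
import glob

def expand_args(args):
    out = []
    for arg in args:
        if "*" in arg or "?" in arg:
            matches = sorted(glob.glob(arg))
            out.extend(matches if matches else [arg])  # keep unmatched pattern
        else:
            out.append(arg)
    return out

print(expand_args(["7z.exe", "a", "archive.exe"]))  # no wildcards: unchanged
```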
Re: subprocess.Popen under windows 7
Am 08.12.2011 23:41, schrieb Frank van den Boom: arglist = [PATH_TO_7ZIP,"a", "-sfx", archive_name, "*", "-r", "-p",PASSWORD] The "*" is resolved by the shell, this is not a wildcard that gets passed to the program. At least not normally, your case might be different. if output: print output print ("Everything is good") Is that Python 2 or 3? That said, if you reduced it to something that doesn't e.g. require 7zip I'd happily run it on an XP system with Python 2.7 to tell you if it works there or not. Doing so would also rule out any influence by 7zip, just in case. Uli -- http://mail.python.org/mailman/listinfo/python-list
Re: Multiprocessing bug, is my editor (SciTE) impeding my progress?
Thanks once again to everyone for their recommendations, here's a follow-up. In summary, I'm still baffled. I tried ipython, as Marco Nawijn suggested. If there is some special setting which returns control to the interpreter when a subprocess crashes, I haven't found it yet. Yes, I'm RTFM. As with SciTE, everything just hangs. So I went back to SciTE for now. And I'm doing what Terry Reedy suggested -- I am editing multiprocess.Pool in place. I made a backup, of course. I am using sudo to run SciTE so that I can edit the system files, and not have to worry about chasing path and import statement problems. What I have found, so far, is no evidence that a string is needed in any of the code. What's the task variable? It's a deeply-nested tuple, containing no strings, not even in the WORKING code. This makes me wonder whether that traceback is truly complete. I wrote a routine to display the contents of task, immediately before the offending put(). Here's a breakdown. In the WORKING version: task: 0 0 (see below) {} task[3]: (see below) task[3][0]: (see below) task[3][0][1]: (see below) task[3][0][1][0]: net shape=(2, 3) inpshape=(307, 2) tgtshape=(307, 2) By watching this run, I've learned that task[0] and task[1] are counters for groups of subprocesses and individual subprocesses, respectively. Suppose we have four subprocesses. When everything is working, task[:2] = [0,0] for the first call, then [0,1], [0,2], [0,3]; then, [1,0], [1,1], [1,2], etc. task[2] points to multiprocessing.Pool.mapstar, a one-line function that I never modify. task[4] is an empty dictionary. So it looks like everything that I provide appears in task[3]. task[3] is just a tuple inside a tuple (which is weird). task[3][0] contains the function to be called (in this case, my function, mean_square_error), and then a tuple containing all of the arguments to be passed to that function. 
The docs say that the function in question must be defined at the top level of the code so that it's importable (it is), and that all the arguments to be sent to that function will be wrapped up in a single tuple -- that is presumably task[3][0][1]. But that presumption is wrong. I wrote a function which creates a collections.namedtuple object of the type SplitData, which contains the function's arguments. It's not task[3][0][1] itself, but the tuple INSIDE it, namely task[3][0][1][0]. More weirdness. You don't need to worry about task[3][0][1][0], other than to note that these are my neural network objects, they are intact, they are the classes I expect, and they are named as I expect -- and that there are NO STRING objects. Now, are there any differences between the working version of my code and the buggy version? Other than a few trivial name changes that I made deliberately, the structure of task looks the SAME... task: 0 0 (see below) {} task[3]: (see below) task[3][0]: (see below) task[3][0][1]: (see below) task[3][0][1][0]: funcshape=(2, 3) inpshape=(307, 2) tgtshape=(307, 2) Again, all the action is in task[3]. I was worried about the empty dictionary in task[4] at first, but I've seen this {} in the working program, too. I'm not sure what it does. For completeness, here's mean_square_error() from the working program: def mean_square_error(b): out = array([b.net(i) for i in b.inp]) return sum((out-b.tgt)**2) And, here's error() from the buggy program. def error(b): out = array([b.func(i) for i in b.inp]) return sum((out-b.tgt)**2) I renamed mean_square_error(), because I realized that the mean-square error is the only kind of error I'll ever be computing. I also renamed "net" to "func", in SplitData, reflecting the more general nature of the Cascade class I'm developing. So I mirror that name change here. Other than that, I trust you can see that error() and mean_square_error() are identical. 
I can call mean_square_error directly with a SplitData tuple and it works. I can call error directly with a SplitData tuple in the broken program, and it ALSO works. I'm only having problems when I try to submit the job through Pool. I tried putting a print trap in error(). When I use Pool then error() never gets called. I suppose that the logical next step is to compare the two Pool instances... onward... :^P -- http://mail.python.org/mailman/listinfo/python-list
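One way to surface a worker crash instead of an apparent hang — not necessarily the cause of the problem above — is to use apply_async().get(), which re-raises the worker's exception in the parent process (a hedged sketch; work is a stand-in for the thread's error function):

```python
# Pool.map can appear to hang when a worker dies; AsyncResult.get()
# with a timeout re-raises worker exceptions in the parent instead.
import multiprocessing

def work(x):
    if x < 0:
        raise ValueError("bad input: %r" % (x,))
    return x * x

if __name__ == "__main__":
    pool = multiprocessing.Pool(processes=2)
    try:
        async_results = [pool.apply_async(work, (n,)) for n in (1, 2, -3)]
        for res in async_results:
            try:
                print(res.get(timeout=30))   # worker errors re-raised here
            except ValueError as exc:
                print("worker failed:", exc)
    finally:
        pool.close()
        pool.join()
```

Note that, as the docs warn, work must be defined at module top level so it can be pickled for the child processes.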
Re: subprocess.Popen under windows 7
I don't have Windows 7 right now, but that shouldn't happen with the code you've given; when trimming code for posting, you should check that the trimmed code still has the exact same problem. Here is the whole code:

#!/usr/bin/env python
# little script to recursively back up a folder with 7zip
SOURCE_DIR = "C:/Users/yoicks/Desktop/source"
DEST_DIR = "C:/Users/yoicks/Desktop/dest"
BACKUP_NAME_PREFIX = "BACKUP"
BACKUP_NAME_DELIMITER = "_"
METHOD = '7zip'
PATH_TO_7ZIP = "C:/Program Files/7-Zip/7z.exe"
PASSWORD = "1234"

import os, time, shutil, sys, tarfile, subprocess, traceback

try:
    # win32
    from msvcrt import getch
except ImportError:
    # unix
    def getch():
        import sys, tty, termios
        fd = sys.stdin.fileno()
        old = termios.tcgetattr(fd)
        try:
            tty.setraw(fd)
            return sys.stdin.read(1)
        finally:
            termios.tcsetattr(fd, termios.TCSADRAIN, old)

def press_any_key():
    print "Press any key to continue."
    getch()

def exit_with_string(exit_string):
    print exit_string
    press_any_key()
    sys.exit(exit_string)

def backup_directory_7zip(srcdir, archive_name):
    if os.path.exists(archive_name):
        exit_with_string("backup path %s already exists!" % archive_name)
    try:
        # see 7zip help
        arglist = [PATH_TO_7ZIP, "a", "-sfx", archive_name, "*", "-r", "-p", PASSWORD]
        print ("try running cmd:\n %s\nin directory\n %s"
               % (' '.join(arglist), srcdir))  # join because i don't want [ ]
        sp = subprocess.Popen(args=arglist, stdout=subprocess.PIPE,
                              stderr=subprocess.PIPE, cwd=srcdir)
        #output, error = subprocess.Popen(args=arglist, stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=srcdir).communicate()
    except:
        print "Error while running 7zip subprocess.\n"
        print "Traceback:\n%s" % traceback.format_exc()
        return False
    output, error = sp.communicate()
    # something i tried:
    #output = sp.stdout.read()
    #error = sp.stderr.read()
    if output:
        print output
    if error:
        print error
        return False
    return archive_name

# build backup name
print "start backup with python-script...\n"
timestr = time.strftime("%Y%m%d_%H%M%S", time.localtime())

if METHOD not in ["7zip"]:
    exit_with_string("METHOD not '7zip'")
if not os.path.exists(SOURCE_DIR):
    exit_with_string("SOURCE_DIR: %s doesn't exist" % os.path.abspath(SOURCE_DIR))
if not os.path.exists(DEST_DIR):
    exit_with_string("DEST_DIR: %s doesn't exist" % os.path.abspath(DEST_DIR))
else:
    print("write backup from %s to %s \n using the %s method...\n"
          % (os.path.abspath(SOURCE_DIR), os.path.abspath(DEST_DIR), METHOD))

if METHOD == "7zip":
    try:
        if not os.path.exists(PATH_TO_7ZIP):
            exit_with_string("Path to 7ZIP %s doesn't exist." % PATH_TO_7ZIP)
    except NameError:
        exit_with_string("variable PATH_TO_7ZIP not defined")

return_value = backup_directory_7zip(
    srcdir=os.path.abspath(SOURCE_DIR),
    archive_name=os.path.abspath(os.path.join(
        DEST_DIR,
        BACKUP_NAME_PREFIX + BACKUP_NAME_DELIMITER + timestr + ".exe")))
if return_value:
    print("Backup successfully written.")
else:
    print("FAILURE during the backup")
press_any_key()

-- http://mail.python.org/mailman/listinfo/python-list