New submission from David Decotigny <[EMAIL PROTECTED]>:

I posted a recipe on ASPN: http://code.activestate.com/recipes/576462/ and Jesse, cheerleader for the inclusion of (multi)processing into python-core, suggested that it could be interesting to add this feature to the next Python releases.

The recipe is based on version 0.52 of the standalone "processing" package. It makes it possible to avoid redundant work when multiple threads send the same job requests to a pool of background worker processes. The recipe details the why and the how.

Some notes on the implementation:
- There is a "Begin/End workaround" section in the code, which works around a limitation of processing 0.52 (see the comments and docstring for details). I sent issue #014431 to the processing issue tracker on BerliOS; once that is fixed, this workaround could be removed.
- See my comment #2 on the recipe, which discusses my thoughts on using weak references.
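For readers without access to the recipe, the core idea can be sketched as follows. This is not the recipe's actual code, just a minimal illustration of the technique using today's multiprocessing module: a lock-protected table of pending AsyncResults lets concurrent threads that submit an identical job share a single worker-side execution. The class name DedupPool and its interface are assumptions for this sketch.

```python
import threading
import multiprocessing

class DedupPool:
    """Wrap a multiprocessing.Pool so that identical jobs submitted
    concurrently by several threads are executed only once by the
    background workers, with all callers sharing the result."""

    def __init__(self, processes=None):
        self._pool = multiprocessing.Pool(processes)
        self._lock = threading.Lock()
        # Maps (func, args) -> the AsyncResult shared by all callers
        # that requested this job while it was still in flight.
        self._pending = {}

    def apply(self, func, args=()):
        key = (func, args)
        with self._lock:
            result = self._pending.get(key)
            if result is None:
                # First caller for this job: submit it to the workers.
                result = self._pool.apply_async(func, args)
                self._pending[key] = result
        # Every caller blocks on the same AsyncResult, so the job
        # runs only once even if many threads requested it.
        value = result.get()
        with self._lock:
            # Forget the job once finished; a later identical request
            # will be computed afresh (no caching of past results).
            self._pending.pop(key, None)
        return value
```

The deduplication only covers requests that overlap in time; once a job completes, its key is dropped, so this is request coalescing rather than result memoization (the weak-reference discussion in comment #2 touches on the related lifetime questions).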
----------
components: Library (Lib)
messages: 72170
nosy: DavidDecotigny, jnoller
severity: normal
status: open
title: allow multiple threads to efficiently send the same requests to a processing.Pool without incurring duplicate processing
type: feature request
versions: Python 2.6, Python 3.0

_______________________________________
Python tracker <[EMAIL PROTECTED]>
<http://bugs.python.org/issue3735>
_______________________________________