On Apr 15, 11:12 pm, Paul Rubin <http://[EMAIL PROTECTED]> wrote:
> I'd like to suggest adding a new operation
>
>    Queue.finish()
>
> This puts a special sentinel object on the queue.  The sentinel
> travels through the queue like any other object, however, when
> q.get() encounters the sentinel, it raises StopIteration instead
> of returning the sentinel.  It does not remove the sentinel from
> the queue, so further calls to q.get also raise StopIteration.
> That permits writing the typical "worker thread" as

This is a pretty good idea.  However, it needs a custom __iter__
method to work... the syntax below doesn't do what you want, since
one-argument iter() expects an iterable, not a bound method like
q.get.

>    for item in iter(q.get): ...

Once you implement __iter__, you are left with 'for item in q'.  The
main danger here is that all the threading synchro stuff is hidden in
the guts of the __iter__ implementation, which isn't terribly clear.
There is no way to catch the Empty exception or pass a timeout to
get(), for instance.
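
To make that concrete, here's roughly the __iter__ I have in mind --
just a sketch, and the IterableQueue name and the put-the-sentinel-back
trick are mine, not anything in the stdlib:

    import Queue

    class IterableQueue(Queue.Queue):
        """Sketch only: a Queue whose iterator ends at a shared sentinel."""

        _sentinel = object()

        def finish(self):
            # The producer calls this once when no more items are coming.
            self.put(self._sentinel)

        def __iter__(self):
            while True:
                item = self.get()   # blocks forever -- no timeout, no Empty
                if item is self._sentinel:
                    # Put the sentinel back so every other consumer
                    # iterating over this queue terminates too.
                    self.put(self._sentinel)
                    return
                yield item

The worker loop then collapses to 'for item in q: ...', but notice that
every policy decision (blocking, timeouts, what the sentinel is) is
buried inside __iter__, which is exactly the problem.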

> however that actually pops the sentinel, so if there are a lot of
> readers then the writing side has to push a separate sentinel for
> each reader.  I found my code cluttered with
>
>     for i in xrange(number_of_worker_threads):
>        q.put(sentinel)
>
> which certainly seems like a code smell to me.

Yeah, it kind of does.  Why not write a Queue + Worker manager that
keeps track of the number of workers and has a .finish() method that
does this smelly task for you?
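
Something along those lines, say -- purely a sketch, with the
WorkerPool name, the handle argument, and the sentinel all made up for
illustration:

    import threading
    import Queue

    _sentinel = object()

    class WorkerPool(object):
        """Sketch only: owns the queue and its worker threads, so
        finish() can push exactly one sentinel per worker without
        the caller having to count threads."""

        def __init__(self, num_workers, handle):
            self.queue = Queue.Queue()
            self.workers = []
            for i in xrange(num_workers):
                t = threading.Thread(target=self._run, args=(handle,))
                t.start()
                self.workers.append(t)

        def _run(self, handle):
            while True:
                item = self.queue.get()
                if item is _sentinel:
                    return
                handle(item)

        def put(self, item):
            self.queue.put(item)

        def finish(self):
            # The smelly loop lives in one place, out of the caller's sight.
            for worker in self.workers:
                self.queue.put(_sentinel)
            for worker in self.workers:
                worker.join()

Calling code is then just pool = WorkerPool(4, handle), pool.put(item)
as needed, and pool.finish() when done.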

-Mike
