On 9/4/19 11:08 AM, Joao S. O. Bueno wrote:
> I second that such a feature would be useful, as I am on the verge
> of implementing a work-around for that in a project right now.

I'm sure I'm missing something, but isn't that the point of a
ThreadPoolExecutor?  Yes, you can submit more requests than you
have resources to execute concurrently, but the executor itself
limits the number of requests it executes at once to the number
given to its initializer (max_workers).  The "blocked" requests
are simply entries in a queue, and shouldn't consume much memory.
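
A minimal sketch of what I mean (the task body is just a
stand-in for real work):

    import time
    from concurrent.futures import ThreadPoolExecutor

    def handle(item):
        time.sleep(0.1)  # stand-in for the real work
        return item

    # At most 4 tasks execute at once; the other submissions sit
    # as small entries in the executor's internal queue until a
    # worker frees up.
    with ThreadPoolExecutor(max_workers=4) as executor:
        futures = [executor.submit(handle, i) for i in range(100)]
        results = [f.result() for f in futures]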

How does blocking the submit call differ from setting max_workers
in the call to ThreadPoolExecutor?
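
That said, if the goal really is to make submit() itself block
(say, because even the queued work items are expensive to hold),
the usual workaround is to gate submissions with a semaphore.
A rough sketch, with a class name of my own choosing:

    import threading
    from concurrent.futures import ThreadPoolExecutor

    class BoundedExecutor:
        """Blocks submit() once `bound` tasks are pending."""

        def __init__(self, max_workers, bound):
            self._executor = ThreadPoolExecutor(max_workers)
            self._semaphore = threading.BoundedSemaphore(bound)

        def submit(self, fn, *args, **kwargs):
            self._semaphore.acquire()  # blocks the caller when full
            try:
                future = self._executor.submit(fn, *args, **kwargs)
            except BaseException:
                self._semaphore.release()
                raise
            # Free a slot as soon as the task finishes.
            future.add_done_callback(
                lambda _: self._semaphore.release())
            return future

        def shutdown(self, wait=True):
            self._executor.shutdown(wait=wait)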

If the use case is uploading files, and you're reading each
entire file into memory before submitting a request to upload
it, then change that design to a ThreadPoolExecutor whose tasks
read the file (preferably in chunks, at that) and upload it.
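Roughly, with upload_chunk() and the paths list as hypothetical
stand-ins for the real client call and the real set of files:

    from concurrent.futures import ThreadPoolExecutor

    CHUNK_SIZE = 1024 * 1024  # 1 MiB per read

    def upload_chunk(path, chunk):
        pass  # hypothetical: the real upload client call goes here

    def upload_file(path):
        # Each task opens its own file and streams it chunk by
        # chunk, so memory use stays bounded per worker.
        with open(path, "rb") as f:
            while True:
                chunk = f.read(CHUNK_SIZE)
                if not chunk:
                    break
                upload_chunk(path, chunk)

    paths = ["a.bin", "b.bin"]  # hypothetical file list
    with ThreadPoolExecutor(max_workers=8) as executor:
        futures = [executor.submit(upload_file, p) for p in paths]
        for future in futures:
            future.result()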