On Thu, Jun 14, 2018 at 12:40 PM Tin Tvrtković <tinches...@gmail.com> wrote:
>
> Hi,
>
> I've been using asyncio a lot lately and have encountered this problem 
> several times. Imagine you want to run a lot of queries against a 
> database: spawning 10000 tasks in parallel will probably cause many of 
> them to fail. What you need is a task pool of sorts, to limit 
> concurrency and run only 20 requests in parallel.
>
> If we were doing this synchronously, we wouldn't spawn 10000 threads 
> using 10000 connections; we would use a thread pool with a limited 
> number of threads and submit the jobs to its queue.
>
> To me, tasks are (somewhat) logically analogous to threads. The solution 
> that first comes to mind is to create an AsyncioTaskExecutor with a 
> submit(coro, *args, **kwargs) method. submit() would put a reference to 
> the coroutine function and its arguments into an asyncio queue; n worker 
> tasks would pull from this queue and await the coroutines.
>
> It'd probably be useful to have this in the stdlib at some point.

Sounds like a good idea!  Feel free to open an issue to prototype the API.

Yury
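
For reference, here is a minimal sketch of the pattern described above,
built only on existing asyncio primitives. The TaskPool name, its API,
and the query_database helper are illustrative assumptions for this
sketch, not an existing or proposed stdlib interface:

    import asyncio
    import traceback


    class TaskPool:
        """Run submitted coroutines with at most num_workers active at once."""

        def __init__(self, num_workers=20):
            self._queue = asyncio.Queue()
            self._workers = [
                asyncio.ensure_future(self._worker())
                for _ in range(num_workers)
            ]

        async def _worker(self):
            while True:
                coro_fn, args, kwargs = await self._queue.get()
                try:
                    await coro_fn(*args, **kwargs)
                except Exception:
                    # A real implementation would surface this, e.g. via a
                    # Future returned by submit(); here we just keep the
                    # worker alive.
                    traceback.print_exc()
                finally:
                    self._queue.task_done()

        def submit(self, coro_fn, *args, **kwargs):
            # Store the coroutine function and its arguments rather than a
            # live coroutine object, so nothing runs until a worker is free.
            self._queue.put_nowait((coro_fn, args, kwargs))

        async def join(self):
            # Wait for the queue to drain, then shut the workers down.
            await self._queue.join()
            for worker in self._workers:
                worker.cancel()
            await asyncio.gather(*self._workers, return_exceptions=True)


    async def query_database(i):
        # Hypothetical stand-in for a real database query.
        await asyncio.sleep(0.01)


    async def main():
        pool = TaskPool(num_workers=20)
        for i in range(10000):
            pool.submit(query_database, i)
        await pool.join()


    asyncio.run(main())  # Python 3.7+

A Semaphore held around each task would bound concurrency just as well;
the queue-plus-workers shape is used here because it mirrors how a
thread pool consumes its job queue, per the analogy above.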