Re: [Python-Dev] A more flexible task creation

2018-06-14 Thread Nathaniel Smith
On Thu, Jun 14, 2018 at 3:31 PM, Tin Tvrtković wrote: > * my gut feeling is spawning a thousand tasks and having them all fighting > over the same semaphore and scheduling is going to be much less efficient > than a small number of tasks draining a queue. Fundamentally, a Semaphore is a queue: h
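The contrast drawn above is between two ways of bounding concurrency: one task per work item, all gated by a shared Semaphore, versus a small fixed pool of workers draining a Queue. A minimal sketch of both, assuming a hypothetical URL list and a stand-in fetch() coroutine (none of this code is from the thread):

import asyncio

URLS = ["https://example.invalid/%d" % i for i in range(1000)]  # hypothetical work items

async def fetch(url):
    await asyncio.sleep(0.01)  # stand-in for real I/O

# Style 1: spawn one task per item, all gated by a shared Semaphore.
async def semaphore_style(limit=20):
    sem = asyncio.Semaphore(limit)
    async def bounded(url):
        async with sem:
            await fetch(url)
    await asyncio.gather(*(bounded(u) for u in URLS))

# Style 2: a small, fixed pool of worker tasks draining a Queue.
async def worker_pool_style(limit=20):
    queue = asyncio.Queue()
    for u in URLS:
        queue.put_nowait(u)
    async def worker():
        while True:
            url = await queue.get()
            try:
                await fetch(url)
            finally:
                queue.task_done()
    workers = [asyncio.ensure_future(worker()) for _ in range(limit)]
    await queue.join()        # every queued item has been processed
    for w in workers:
        w.cancel()

asyncio.get_event_loop().run_until_complete(semaphore_style())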

Re: [Python-Dev] Python-Dev Digest, Vol 179, Issue 21

2018-06-14 Thread casanova yassine
The Jseries acknowledgement by using Jetty containers can get you a better resolution to Python wheel asynchronism bugs. Sent from an Android smartphone with GMX Mail. On 14/06/2018, 4:00 PM python-dev-requ...@python.org wrote: On 13 Jun 2018, at 15:42, Nick Coghlan mailto:ncogh...@gmail

Re: [Python-Dev] A more flexible task creation

2018-06-14 Thread Tin Tvrtković
On Thu, Jun 14, 2018 at 10:03 PM Steve Dower wrote: > I often use > semaphores for this when I need it, and it looks like > asyncio.Semaphore() is sufficient for this: > > > import asyncio > task_limiter = asyncio.Semaphore(4) > > async def my_task(): > await task_limiter.acquire() > tr
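The quoted code is cut off by the digest; here is a guess at the complete pattern it sketches, with the body of the try block and the surrounding driver invented purely for illustration:

import asyncio

task_limiter = asyncio.Semaphore(4)

async def my_task():
    await task_limiter.acquire()
    try:
        await asyncio.sleep(1)          # placeholder for the real work
    finally:
        task_limiter.release()

async def main():
    # Only four my_task() bodies ever run concurrently.
    await asyncio.gather(*(my_task() for _ in range(100)))

asyncio.get_event_loop().run_until_complete(main())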

Re: [Python-Dev] A more flexible task creation

2018-06-14 Thread Tin Tvrtković
Other folks have already chimed in, so I'll be to the point. Try writing a simple asyncio web scraper (using maybe the aiohttp library) and create 5000 tasks for scraping different sites. My prediction is a whole lot of them will time out due to various reasons. Other responses inline. On Thu, Ju
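For readers who want to try the experiment, a rough sketch of such a scraper with a concurrency cap added; the URLs are hypothetical and the timeout handling is only illustrative, not a recommendation:

import asyncio
import aiohttp

URLS = ["https://example.invalid/page/%d" % i for i in range(5000)]  # hypothetical targets

async def scrape(session, url, limiter):
    async with limiter:                                   # cap in-flight requests
        try:
            async with session.get(url, timeout=aiohttp.ClientTimeout(total=30)) as resp:
                return await resp.text()
        except (asyncio.TimeoutError, aiohttp.ClientError):
            return None

async def main(limit=50):
    limiter = asyncio.Semaphore(limit)
    async with aiohttp.ClientSession() as session:
        results = await asyncio.gather(*(scrape(session, u, limiter) for u in URLS))
    print(sum(r is None for r in results), "requests failed or timed out")

asyncio.get_event_loop().run_until_complete(main())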

Re: [Python-Dev] A more flexible task creation

2018-06-14 Thread Steve Dower
On 14Jun2018 1214, Chris Barker via Python-Dev wrote: Excuse my ignorance (or maybe it's a vocabulary thing), but I'm trying to understand the problem here. But if I have this right: I've been using asyncio a lot lately and have encountered this problem several times. Imagine you want

Re: [Python-Dev] A more flexible task creation

2018-06-14 Thread Joni Orponen
On Thu, Jun 14, 2018 at 9:17 PM Chris Barker via Python-Dev < python-dev@python.org> wrote: > Excuse my ignorance (or maybe it's a vocabulary thing), but I'm trying to > understand the problem here. > Vocabulary-wise 'queue depth' might be a suitable mental aid for what people actually want to li
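"Queue depth" here means the number of items allowed to be pending or in flight at once, which asyncio.Queue exposes directly through maxsize. A toy illustration of that back-pressure (producer, consumers, and all numbers made up for the sketch):

import asyncio

async def producer(queue, n_items, n_consumers):
    for i in range(n_items):
        await queue.put(i)            # blocks once maxsize items are pending
    for _ in range(n_consumers):
        await queue.put(None)         # sentinel: tell each consumer to stop

async def consumer(queue):
    while True:
        item = await queue.get()
        if item is None:
            break
        await asyncio.sleep(0.01)     # stand-in for the real query

async def main(n_consumers=4):
    queue = asyncio.Queue(maxsize=20)       # the "queue depth" being limited
    await asyncio.gather(
        producer(queue, 1000, n_consumers),
        *(consumer(queue) for _ in range(n_consumers)))

asyncio.get_event_loop().run_until_complete(main())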

Re: [Python-Dev] A more flexible task creation

2018-06-14 Thread Chris Barker via Python-Dev
Excuse my ignorance (or maybe it's a vocabulary thing), but I'm trying to understand the problem here. But if I have this right: I've been using asyncio a lot lately and have encountered this problem > several times. Imagine you want to do a lot of queries against a database, > spawning 10 000 tas

Re: [Python-Dev] Idea: reduce GC threshold in development mode (-X dev)

2018-06-14 Thread Antoine Pitrou
On Fri, 8 Jun 2018 09:48:03 +0200 Victor Stinner wrote: > > Question: Do you think that bugs spotted by a GC collection are common > enough to change the GC thresholds in development mode (new -X dev > flag of Python 3.7)? I don't think replacing a more-or-less arbitrary value with another more-
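For context, the thresholds in question are the ones exposed by the gc module; the proposal is about lowering them automatically in development mode, which -X dev in Python 3.7 does not do. The values below are purely illustrative:

import gc

# CPython's defaults: (700, 10, 10). A generation-0 collection runs once
# the count of allocations minus deallocations exceeds the first value.
print(gc.get_threshold())

# The idea being debated: collect much more often in development mode so
# that bugs only exposed by a GC pass show up sooner. These numbers are
# invented for illustration, not anything -X dev actually sets.
gc.set_threshold(5, 3, 3)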

Re: [Python-Dev] A more flexible task creation

2018-06-14 Thread Gustavo Carneiro
On Thu, 14 Jun 2018 at 17:40, Tin Tvrtković wrote: > Hi, > > I've been using asyncio a lot lately and have encountered this problem > several times. Imagine you want to do a lot of queries against a database, > spawning 10 000 tasks in parallel will probably cause a lot of them to fail. > What you

Re: [Python-Dev] A more flexible task creation

2018-06-14 Thread Yury Selivanov
On Thu, Jun 14, 2018 at 12:40 PM Tin Tvrtković wrote: > > Hi, > > I've been using asyncio a lot lately and have encountered this problem > several times. Imagine you want to do a lot of queries against a database, > spawning 10 000 tasks in parallel will probably cause a lot of them to fail. > W

Re: [Python-Dev] A more flexible task creation

2018-06-14 Thread Tin Tvrtković
Hi, I've been using asyncio a lot lately and have encountered this problem several times. Imagine you want to do a lot of queries against a database, spawning 10 000 tasks in parallel will probably cause a lot of them to fail. What you need is a task pool of sorts, to limit concurrency and do only
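The snippet is cut off, but the "task pool of sorts" it asks for could look roughly like the sketch below; the TaskPool class and the limit of 20 are invented for illustration and are not an existing asyncio API:

import asyncio

class TaskPool:
    """Run at most `limit` coroutines at once; join() waits for all of them."""

    def __init__(self, limit):
        self._sem = asyncio.Semaphore(limit)
        self._tasks = []

    async def spawn(self, coro):
        await self._sem.acquire()                      # wait for a free slot
        task = asyncio.ensure_future(coro)
        task.add_done_callback(lambda t: self._sem.release())
        self._tasks.append(task)
        return task

    async def join(self):
        await asyncio.gather(*self._tasks)

async def query(i):
    await asyncio.sleep(0.05)                          # stand-in for a database query

async def main():
    pool = TaskPool(20)                                # at most 20 queries in flight
    for i in range(10000):
        await pool.spawn(query(i))                     # blocks while the pool is full
    await pool.join()

asyncio.get_event_loop().run_until_complete(main())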