Op 15/05/20 om 00:36 schreef Stephane Tougard:


Hello,

A multithreaded program written in Python is connected to a Postgres
database. To avoid concurrent-access issues with the database, it starts
a thread that receives all SQL requests via queue.put and queue.get (it
only does inserts, so there is no issue with the return value of the SQL request).

As long as it runs with 10 threads, there are no issues. At 100 threads,
the software blocks on what I think is a locking issue.

I assume Python's threading and queue modules are robust enough to
handle 100 threads without issue (prove me wrong here), so I guess
the problem is in my code.

It is not the number of threads in itself that causes problems. But in my experience, if you have an unbounded queue and your producers outpace the consumers, that can cause problems. And if you have 100 times more producers than consumers, that can easily be the case.

So my advice is never to use an unbounded queue. If the number of producers is small, I go for a size of 10. If the number of producers gets larger, I go for a size between the number of producers and twice that.
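A minimal sketch of that setup: 100 producer threads feeding a single consumer through a bounded queue.Queue. The numbers, table name, and sentinel-based shutdown are illustrative assumptions, not the original poster's code; the real consumer would own the Postgres connection and execute each statement, whereas here it just counts items. The point is that queue.put blocks once maxsize is reached, throttling fast producers instead of letting the queue grow without bound.

```python
import queue
import threading

NUM_PRODUCERS = 100          # hypothetical: one queue feeder per worker thread
ITEMS_PER_PRODUCER = 10      # hypothetical workload per producer

# Bounded queue, sized at twice the number of producers as suggested above.
# put() blocks when the queue is full, applying backpressure to producers.
sql_queue = queue.Queue(maxsize=2 * NUM_PRODUCERS)

def producer(n):
    for i in range(ITEMS_PER_PRODUCER):
        # Queue (statement, params) tuples; table/column names are made up.
        sql_queue.put(("INSERT INTO t (val) VALUES (%s)", (n * 10 + i,)))

def consumer(results):
    # The single consumer would hold the one DB connection and execute
    # each statement; here we only count items to keep the sketch runnable.
    handled = 0
    while True:
        item = sql_queue.get()
        if item is None:         # sentinel: all producers are done
            sql_queue.task_done()
            break
        handled += 1
        sql_queue.task_done()
    results.append(handled)

results = []
c = threading.Thread(target=consumer, args=(results,))
c.start()

producers = [threading.Thread(target=producer, args=(n,))
             for n in range(NUM_PRODUCERS)]
for t in producers:
    t.start()
for t in producers:
    t.join()

sql_queue.put(None)              # tell the consumer to shut down
c.join()
print(results[0])                # 1000 items handled
```

With the queue capped at 200 entries, the producers simply wait on put() whenever the consumer falls behind, so memory stays bounded no matter how many producer threads you start.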

--
Antoon.
