On 03/01/2012 10:51 PM, Marti Raudsepp wrote:
The problem with IN() and ARRAY[] is that the whole list of numbers
has to be parsed by the SQL syntax parser, which has significant
memory and CPU overhead (it has to accept arbitrary expressions in the
list). But there's a shortcut around the parser: you can pass the whole list in as a single array literal string, e.g. select * from the_table where id = ANY('{1,2,3,4,5}').
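
To make the difference concrete, here is a rough sketch of both approaches from Python with psycopg2 (the connection string and the id range are made-up placeholders; the_table and the id column come from the query quoted further down):

import psycopg2

ids = list(range(1, 100001))

conn = psycopg2.connect("dbname=test")
cur = conn.cursor()

# psycopg2 adapts a Python list to an ARRAY[...] constructor, so the SQL
# parser still sees one expression per element in the list.
cur.execute("SELECT * FROM the_table WHERE id = ANY(%s)", (ids,))

# The shortcut: build one array literal string and cast it, so the parser
# only has to handle a single text constant.
id_literal = "{" + ",".join(str(i) for i in ids) + "}"
cur.execute("SELECT * FROM the_table WHERE id = ANY(%s::int[])", (id_literal,))
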
Quoting myself:
So, is there some common wisdom about batch sizes? Or is it better
to do the inserts and deletes in just one batch? I think the case for
performance problems needs to be strong before default limits are
considered for PostgreSQL.
I did a little test about this. My test was
On Thu, Mar 1, 2012 at 21:06, Kääriäinen Anssi anssi.kaariai...@thl.fi wrote:
The queries are select * from the_table where id = ANY(ARRAY[list_of_numbers]) and the similar delete, too.
[...] However, once you go into
millions of items in the list, the query will OOM my Postgres server.
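
For reference, a minimal sketch of the kind of batching being discussed, again with psycopg2 (the batch size of 1000 and the connection string are illustrative assumptions, not something settled in this thread):

import psycopg2

def delete_in_batches(conn, ids, batch_size=1000):
    # Issue the delete over fixed-size slices of the id list instead of one
    # huge ANY(ARRAY[...]) list, so no single statement carries millions of
    # elements through the parser.
    with conn.cursor() as cur:
        for start in range(0, len(ids), batch_size):
            chunk = ids[start:start + batch_size]
            cur.execute("DELETE FROM the_table WHERE id = ANY(%s)", (chunk,))
    conn.commit()

conn = psycopg2.connect("dbname=test")
delete_in_batches(conn, list(range(1, 1000001)))
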
Hello all,
I am trying to help the Django project by investigating whether there should
be some default batch size limits for insert and delete queries. This is
related to a couple of tickets concerning SQLite's inability to handle
more than 1000 parameters in a single query. That backend