Re: Insert into on conflict, data size upto 3 billion records

2021-02-15 Thread Rob Sargent
On 2/15/21 12:22 PM, Karthik K wrote: Yes, I'm using \copy to load the batch table. With the new design we are doing, we expect fewer updates and more inserts going forward. One of the target columns I'm updating is indexed, so I will drop the index and try it out. Also, from your…
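The drop-index suggestion discussed above can be sketched as follows. This is a minimal illustration, not the thread's actual DDL; the table `item` and index `item_qty_idx` are hypothetical names:

```sql
-- Dropping a non-essential index before a bulk load avoids per-row
-- index maintenance; recreate it once the load is done.
DROP INDEX IF EXISTS item_qty_idx;

-- ... bulk load / upsert into item here ...

-- CONCURRENTLY avoids blocking readers and writers while rebuilding
-- (it cannot run inside a transaction block).
CREATE INDEX CONCURRENTLY item_qty_idx ON item (qty);
```

Note this only applies to secondary indexes; the unique index backing the ON CONFLICT arbiter must stay in place for the upsert to work.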

Re: Insert into on conflict, data size upto 3 billion records

2021-02-15 Thread Tim Cross
Karthik K writes: > Exactly. For now, what I did was: as the table is already partitioned, I > created 50 different connections and tried updating the target table by > directly querying from the source partition tables. Are there any other > techniques that I can use to speed this up? Also…

Re: Insert into on conflict, data size upto 3 billion records

2021-02-15 Thread Karthik K
Yes, I'm using \copy to load the batch table. With the new design we are doing, we expect fewer updates and more inserts going forward. One of the target columns I'm updating is indexed, so I will drop the index and try it out. Also, from your suggestion above, splitting the ON CONFLICT…
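The "\copy plus split ON CONFLICT" approach mentioned here can be sketched as two passes over the staging table instead of one upsert: update matches first, then insert the remainder. Table and column names (`item`, `item_batch`, `batch.csv`) are hypothetical:

```sql
-- Stage the batch with \copy (client-side COPY in psql).
\copy item_batch FROM 'batch.csv' WITH (FORMAT csv)

-- Pass 1: update rows that already exist in the target.
UPDATE item t
SET    qty = b.qty
FROM   item_batch b
WHERE  t.id = b.id;

-- Pass 2: insert only the rows that are new.
INSERT INTO item (id, qty)
SELECT b.id, b.qty
FROM   item_batch b
WHERE  NOT EXISTS (SELECT 1 FROM item t WHERE t.id = b.id);
```

Splitting this way helps when most batch rows are inserts, since the insert pass does no conflict arbitration; it does require that no concurrent writer touches the same keys between the two passes.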

Re: Insert into on conflict, data size upto 3 billion records

2021-02-15 Thread Rob Sargent
On 2/15/21 11:41 AM, Karthik K wrote: Exactly. For now, what I did was: as the table is already partitioned, I created 50 different connections and tried updating the target table by directly querying from the source partition tables. Are there any other techniques that I can use to speed…

Re: Insert into on conflict, data size upto 3 billion records

2021-02-15 Thread Karthik K
Exactly. For now, what I did was: as the table is already partitioned, I created 50 different connections and tried updating the target table by directly querying from the source partition tables. Are there any other techniques that I can use to speed this up? Also, when we use ON CONFLICT…
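The one-connection-per-partition approach described above amounts to running the same statement against each source/target partition pair in parallel. A sketch of what a single worker's statement might look like, with hypothetical partition names (`item_batch_p7` as the source, `item` partitioned on `id`):

```sql
-- Executed on one of the 50 connections; each worker targets a
-- different source partition so the workers touch disjoint key ranges.
UPDATE item t
SET    qty = b.qty
FROM   item_batch_p7 b
WHERE  t.id = b.id;
-- With item partitioned on id, partition pruning confines this
-- update to the matching target partition.
```

Keeping each worker on disjoint keys avoids lock contention between the parallel sessions; a larger `max_wal_size` and enough `max_connections` headroom are the usual knobs to check first.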

Re: Insert into on conflict, data size upto 3 billion records

2021-02-13 Thread Ron
On 2/12/21 12:46 PM, Karthik Kumar Kondamudi wrote: Hi, I'm looking for suggestions on how I can improve the performance of the below merge statement. We have a batch process that batch-loads the data into the _batch tables using Postgres, and the task is to update the main target tables if…

Insert into on conflict, data size upto 3 billion records

2021-02-13 Thread Karthik Kumar Kondamudi
Hi, I'm looking for suggestions on how I can improve the performance of the below merge statement. We have a batch process that batch-loads the data into the _batch tables using Postgres, and the task is to update the main target tables if the record exists, else insert into it. Sometimes these batch tables…
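The upsert pattern this thread is about can be sketched with INSERT ... ON CONFLICT, Postgres's native merge-style statement. The table and columns (`item`, `item_batch`, `id`, `qty`, `updated_at`) are illustrative stand-ins, not the poster's actual schema:

```sql
-- Upsert staged batch rows into the main table in one statement.
-- ON CONFLICT requires a unique index on the arbiter column (id).
INSERT INTO item AS t (id, qty, updated_at)
SELECT id, qty, updated_at
FROM   item_batch
ON CONFLICT (id) DO UPDATE
SET    qty        = EXCLUDED.qty,
       updated_at = EXCLUDED.updated_at
WHERE  t.updated_at < EXCLUDED.updated_at;  -- skip rows that would not change
```

At the 3-billion-row scale discussed here, the WHERE clause on DO UPDATE matters: it suppresses no-op updates, which would otherwise still write dead tuples and WAL for every conflicting row.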