COPY may not work in our scenario since we need to join data from multiple
tables and then convert it to JSON using row_to_json. This JSON data
eventually needs to be stored in a target table.
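
For context, what we are running today looks roughly like the sketch below
(all table and column names here are placeholders, not our real schema):

    -- Rough shape of the current statement; names are made up.
    INSERT INTO target_table (payload)
    SELECT row_to_json(t)::jsonb
    FROM (
        SELECT a.id, a.col1, b.col2, c.col3
        FROM table_a a
        JOIN table_b b ON b.a_id = a.id
        JOIN table_c c ON c.a_id = a.id
    ) t;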

Would it be better to break the process into batches of around 10,000 rows
and insert each batch in its own transaction? Or is there a better solution
available?
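
If batching is the way to go, I was thinking of something along the lines of
the sketch below: a DO block that walks the key range and commits after each
batch. It assumes table_a has an integer primary key id, so batches are key
ranges of ~10,000 and may hold fewer rows where there are gaps; all names are
placeholders.

    DO $$
    DECLARE
        last_id bigint := 0;
        max_id  bigint;
    BEGIN
        SELECT max(id) INTO max_id FROM table_a;

        WHILE last_id < coalesce(max_id, 0) LOOP
            INSERT INTO target_table (payload)
            SELECT row_to_json(t)::jsonb
            FROM (
                SELECT a.id, a.col1, b.col2
                FROM table_a a
                JOIN table_b b ON b.a_id = a.id
                WHERE a.id >  last_id
                  AND a.id <= last_id + 10000
            ) t;

            last_id := last_id + 10000;
            -- Give each batch its own transaction; COMMIT inside a
            -- top-level DO block works on PG 11+.
            COMMIT;
        END LOOP;
    END $$;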

On Sat, Mar 9, 2024 at 9:01 PM hector vass <hector.v...@gmail.com> wrote:

>
>
> On Sat, Mar 9, 2024 at 3:02 PM kuldeep singh <kuldeeparor...@gmail.com>
> wrote:
>
>> Hi,
>>
>> We are inserting close to 1M records with a single jsonb column, but the
>> query is getting stuck.
>>
>> We are using INSERT INTO ... SELECT * .., so all the operations are within
>> the DB.
>>
>> If we run the SELECT query on its own, it returns the data for all rows in
>> 40 sec, but with the INSERT it gets stuck.
>>
>> PG Version - 15.
>>
>> What could be the problem here?
>>
>> Regards,
>> KD
>>
>
> Inserting 1M rows, especially JSON that can be large, variable in size,
> stored as blobs and indexed, is perhaps not the correct way to do this;
> insert performance will also depend on your tuning. Supporting
> transactions, users, or bulk processing are three sides of a compromise.
> You should perhaps consider that INSERT is for inserting a few rows into
> live tables ... you might be better off using COPY or \copy, or pg_dump if
> you are just trying to replicate a large table.
>
>
