[GENERAL] slow speeds after 2 million rows inserted

2006-12-29 Thread James Neff
Greetings, I've got a Java application that reads data from a flat file and inserts it into a table. The first 2 million rows (each file contained about 1 million lines) went pretty fast, less than 40 minutes to insert into the database. After that the insert speed is slow. I think I may
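A minimal sketch of the kind of loader being described, assuming a PostgreSQL JDBC connection and the data_archive table that shows up later in the thread; the connection URL, credentials, and exact column list are illustrative guesses, not the poster's actual code:

    // Hypothetical baseline: one INSERT per input line, autocommit left at the
    // JDBC default (on), so every statement is also its own transaction.
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class FlatFileLoader {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:postgresql://localhost/mydb", "user", "password");
                 BufferedReader in = new BufferedReader(new FileReader(args[0]));
                 Statement st = conn.createStatement()) {
                String line;
                int lineNumber = 0;
                while ((line = in.readLine()) != null) {
                    lineNumber++;
                    // Builds the SQL by concatenation, mirroring the approach
                    // quoted later in the thread; single quotes are doubled to
                    // keep the literal valid.
                    st.executeUpdate("INSERT INTO data_archive "
                        + "(batchid, raw_data, status, line_number) VALUES ('1', '"
                        + line.replace("'", "''") + "', '1', '" + lineNumber + "')");
                }
            }
        }
    }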

Re: [GENERAL] slow speeds after 2 million rows inserted

2006-12-29 Thread Joshua D. Drake
On Fri, 2006-12-29 at 12:39 -0500, James Neff wrote: > Greetings, > > I've got a Java application that reads data from a flat file and > inserts it into a table. The first 2 million rows (each file > contained about 1 million lines) went pretty fast. Less than 40 mins to > insert into the

Re: [GENERAL] slow speeds after 2 million rows inserted

2006-12-29 Thread Rodrigo Gonzalez
James Neff wrote: Greetings, I've got a Java application that reads data from a flat file and inserts it into a table. The first 2 million rows (each file contained about 1 million lines) went pretty fast. Less than 40 mins to insert into the database. After that the insert speed is sl

Re: [GENERAL] slow speeds after 2 million rows inserted

2006-12-29 Thread Joshua D. Drake
> > there is also an index on batchid. > > > > The insert command is like so: > > > > "INSERT INTO data_archive (batchid, claimid, memberid, raw_data, status, > > line_number) VALUES ('" + commandBatchID + "', '', '', '" + raw_data + > > "', '1', '" + myFilter.claimLine + "');"; Also as you
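The statement quoted above is assembled by string concatenation, which means quoting and escaping problems and a fresh parse of the SQL for every row. A hedged sketch of the same INSERT with a PreparedStatement and bound parameters; the column types are guesses (the original quoted every value, including line_number, as a string literal):

    // Hypothetical rewrite of the concatenated INSERT using bound parameters.
    import java.sql.Connection;
    import java.sql.PreparedStatement;

    class DataArchiveInsert {
        static void insertRow(Connection conn, String commandBatchID,
                              String rawData, int claimLine) throws Exception {
            String sql = "INSERT INTO data_archive "
                       + "(batchid, claimid, memberid, raw_data, status, line_number) "
                       + "VALUES (?, ?, ?, ?, ?, ?)";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setString(1, commandBatchID);
                ps.setString(2, "");          // claimid was empty in the original
                ps.setString(3, "");          // memberid was empty in the original
                ps.setString(4, rawData);
                ps.setString(5, "1");         // status
                ps.setInt(6, claimLine);      // assumes line_number is an integer column
                ps.executeUpdate();
            }
        }
    }

In a real loader the PreparedStatement would be created once outside the per-row loop and only the parameters rebound for each row.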

Re: [GENERAL] slow speeds after 2 million rows inserted

2006-12-29 Thread James Neff
Joshua D. Drake wrote: Also as you are running 8.2 you can use multi-valued inserts... INSERT INTO data_archive values () () () Would this speed things up? Or is that just another way to do it? Thanks, James
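For reference, the multi-row form added in PostgreSQL 8.2 takes comma-separated row groups: INSERT INTO data_archive (...) VALUES (...), (...), (...). A sketch of building one such statement for a small batch; the fixed batchid/status handling and method names are assumptions, not code from the thread:

    // Hypothetical: insert a small batch of raw lines with one multi-row INSERT
    // (PostgreSQL 8.2+), still using bound parameters.
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.util.List;

    class MultiRowInsert {
        static void insertBatch(Connection conn, String batchId,
                                List<String> rawLines, int firstLineNumber) throws Exception {
            StringBuilder sql = new StringBuilder(
                "INSERT INTO data_archive (batchid, raw_data, status, line_number) VALUES ");
            for (int i = 0; i < rawLines.size(); i++) {
                sql.append(i == 0 ? "(?, ?, ?, ?)" : ", (?, ?, ?, ?)");
            }
            try (PreparedStatement ps = conn.prepareStatement(sql.toString())) {
                int p = 1;
                for (int i = 0; i < rawLines.size(); i++) {
                    ps.setString(p++, batchId);
                    ps.setString(p++, rawLines.get(i));
                    ps.setString(p++, "1");
                    ps.setInt(p++, firstLineNumber + i);
                }
                ps.executeUpdate();   // the whole batch goes to the server as one statement
            }
        }
    }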

Re: [GENERAL] slow speeds after 2 million rows inserted

2006-12-29 Thread Frank Finner
When do you commit these inserts? I occasionally found similar problems when I do heavy inserting/updating within one single transaction. At first everything runs fast; after some time everything slows down. If I commit the inserts every 1000 rows or so (large rows, small engine), this phenomenon does no
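A minimal sketch of the pattern Frank describes, assuming autocommit is switched off and using the thread's figure of roughly 1000 rows per commit; the method and variable names are illustrative:

    // Hypothetical loader loop that commits every 1000 rows instead of running
    // the whole file in one transaction (or one transaction per row).
    import java.io.BufferedReader;
    import java.sql.Connection;
    import java.sql.PreparedStatement;

    class BatchedCommitLoader {
        static void load(Connection conn, BufferedReader in, String batchId) throws Exception {
            conn.setAutoCommit(false);          // group many INSERTs per transaction
            try (PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO data_archive (batchid, raw_data, status, line_number) "
                   + "VALUES (?, ?, ?, ?)")) {
                String line;
                int n = 0;
                while ((line = in.readLine()) != null) {
                    ps.setString(1, batchId);
                    ps.setString(2, line);
                    ps.setString(3, "1");
                    ps.setInt(4, ++n);
                    ps.executeUpdate();
                    if (n % 1000 == 0) {
                        conn.commit();          // end this batch; the next INSERT opens a new one
                    }
                }
                conn.commit();                  // commit the final partial batch
            }
        }
    }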

Re: [GENERAL] slow speeds after 2 million rows inserted

2006-12-29 Thread James Neff
Joshua D. Drake wrote: You need to vacuum during the inserts :) Joshua D. Drake I ran the vacuum during the INSERT and it seemed to help a little, but it's still relatively slow compared to the first 2 million records. Any other ideas? Thanks, James

Re: [GENERAL] slow speeds after 2 million rows inserted

2006-12-29 Thread Joshua D. Drake
On Fri, 2006-12-29 at 13:21 -0500, James Neff wrote: > Joshua D. Drake wrote: > > Also as you are running 8.2 you can use multi-valued inserts... > > > > INSERT INTO data_archive values () () () > > > > Would this speed things up? Or is that just another way to do it? The fastest way will be c

Re: [GENERAL] slow speeds after 2 million rows inserted

2006-12-29 Thread Rodrigo Gonzalez
Joshua D. Drake wrote: On Fri, 2006-12-29 at 13:21 -0500, James Neff wrote: Joshua D. Drake wrote: Also as you are running 8.2 you can use multi-valued inserts... INSERT INTO data_archive values () () () Would this speed things up? Or is that just another way to do it? The fastest way wi

Re: [GENERAL] slow speeds after 2 million rows inserted

2006-12-29 Thread Frank Finner
In Java, assuming you have a Connection c, you simply say "c.commit();" after doing some action on the database. After every commit, the transaction will be executed and closed and a new one opened, which runs until the next commit. Regards, Frank. On Fri, 29 Dec 2006 13:23:37 -0500 James Neff

Re: [GENERAL] slow speeds after 2 million rows inserted

2006-12-29 Thread James Neff
Frank Finner wrote: In Java, assuming you have a Connection c, you simply say "c.commit();" after doing some action on the database. After every commit, the transaction will be executed and closed and a new one opened, which runs until the next commit. Regards, Frank. That did it, thank

Re: [GENERAL] slow speeds after 2 million rows inserted

2006-12-29 Thread Nikola Milutinovic
> The fastest way will be copy. > The second fastest will be multi-valued inserts in batches, e.g.: > > INSERT INTO data_archive values () () () (I don't know what the max is) > > but commit every 1000 inserts or so. Is this some empirical value? Can someone give heuristics as to how to calculate
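For the "copy" option at the top of that quote, the PostgreSQL JDBC driver exposes a CopyManager API in newer releases (org.postgresql.copy.CopyManager; it may not exist in drivers as old as the 8.2-era ones discussed here). A hedged sketch, assuming the flat file is already in COPY's default tab-delimited text format and the Connection is a plain pgjdbc connection rather than a pooled wrapper:

    // Hypothetical bulk load with COPY through the driver's CopyManager.
    import java.io.FileReader;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import org.postgresql.PGConnection;
    import org.postgresql.copy.CopyManager;

    class CopyLoad {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:postgresql://localhost/mydb", "user", "password")) {
                CopyManager copy = ((PGConnection) conn).getCopyAPI();
                try (FileReader file = new FileReader(args[0])) {
                    long rows = copy.copyIn(
                        "COPY data_archive (batchid, raw_data, status, line_number) FROM STDIN",
                        file);
                    System.out.println("Copied " + rows + " rows");
                }
            }
        }
    }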

Re: [GENERAL] slow speeds after 2 million rows inserted

2006-12-29 Thread Guy Rouillier
Frank Finner wrote: In Java, assuming you have a Connection c, you simply say "c.commit();" after doing some action on the database. After every commit, the transaction will be executed and closed and a new one opened, which runs until the next commit. Assuming, of course, you started with c.setAutoCommit(false)

Re: [GENERAL] slow speeds after 2 million rows inserted

2006-12-31 Thread Nikola Milutinovic
> 1. There is no difference (speed-wise) between committing every 1K or every > 250K rows. It was quite some time ago that I experimented with this. My last experiment was on PG 7.2 or 7.3. I was inserting about 800,000 rows. Inserting without transactions took 25 hrs. Inserting with 10,000 rows per transaction took about 2.5 hrs.

Re: [GENERAL] slow speeds after 2 million rows inserted

2006-12-31 Thread Richard Broersma Jr
> It was quite some time ago that I experimented with this. My last > experiment was on PG > 7.2 or 7.3. I was inserting about 800,000 rows. Inserting without transactions > took 25 hrs. > Inserting with 10,000 rows per transaction took about 2.5 hrs. So, the > speedup was 10x. I have > not
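Since the numbers in these messages are empirical and hardware-dependent, the most reliable answer to the "how do I pick a batch size" question is to measure. A rough sketch of timing the same workload at several commit intervals; the bench_rows table, its payload column, and the row count are placeholders, not anything from the thread:

    // Hypothetical micro-benchmark: load identical rows with different commit
    // intervals and compare wall-clock times.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.Statement;

    class CommitIntervalBench {
        public static void main(String[] args) throws Exception {
            int totalRows = 100_000;
            int[] intervals = {1, 100, 1_000, 10_000, 100_000};
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:postgresql://localhost/mydb", "user", "password")) {
                conn.setAutoCommit(false);
                for (int interval : intervals) {
                    try (Statement st = conn.createStatement()) {
                        st.executeUpdate("TRUNCATE bench_rows");   // reset between runs
                    }
                    conn.commit();
                    long start = System.currentTimeMillis();
                    try (PreparedStatement ps = conn.prepareStatement(
                             "INSERT INTO bench_rows (payload) VALUES (?)")) {
                        for (int i = 1; i <= totalRows; i++) {
                            ps.setString(1, "row " + i);
                            ps.executeUpdate();
                            if (i % interval == 0) conn.commit();
                        }
                    }
                    conn.commit();
                    System.out.println(interval + " rows/commit: "
                        + (System.currentTimeMillis() - start) + " ms");
                }
            }
        }
    }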

Re: [GENERAL] slow speeds after 2 million rows inserted

2006-12-31 Thread Chad Wagner
On 12/31/06, Nikola Milutinovic <[EMAIL PROTECTED]> wrote: > 1. There is no difference (speed-wise) between committing every 1K or every 250K rows. It was quite some time ago that I experimented with this. My last experiment was on PG 7.2 or 7.3. I was inserting about 800,000 rows. Inserti