On 10/25/06, Craig A. James <[EMAIL PROTECTED]> wrote:
Jim C. Nasby wrote:
> Well, given that perl is using an entire CPU, it sounds like you should
> start looking either at ways to remove some of the overhead from perl,
> or to split that perl into multiple processes.
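Jim's splitting suggestion could look roughly like this in outline (a hypothetical Python sketch rather than the original Perl, which isn't shown; `transform` is a stand-in for the real per-row work):

```python
# Hypothetical sketch: spread CPU-bound per-row transformation across
# several worker processes instead of one process doing everything.
import multiprocessing as mp

def transform(line):
    # stand-in for the real per-row processing
    return line.strip().upper()

def parallel_transform(lines, workers=4):
    # "fork" keeps this safe to run from a plain script on Linux
    ctx = mp.get_context("fork")
    with ctx.Pool(processes=workers) as pool:
        # a larger chunksize keeps inter-process overhead low for small rows
        return pool.map(transform, lines, chunksize=2)

if __name__ == "__main__":
    print(parallel_transform(["a,1\n", "b,2\n", "c,3\n"]))
```

The same shape works in Perl with fork() and a pipe per child; the point is that a single CPU-saturated process is the cue to shard the input stream.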

I use Perl for big database copies (usually with some processing/transformation
along the way) and I've never seen 100% CPU usage except for brief periods, even
when copying BLOBS and such.  My typical copy divides operations into blocks,
for example doing

I'm just doing CSV style transformations (and calling a lot of
functions along the way), but the end result is a straight bulk load
of data into a blank database.  And we've established that Postgres
can do *way* better than what I am seeing, so it's not surprising that
perl is using 100% of a CPU.
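For what it's worth, the transform-then-bulk-load step can be sketched like this (a hypothetical Python example, not the original Perl; the trim/NULL handling is an invented placeholder for whatever functions are really being called per row):

```python
# Hypothetical sketch: turn CSV rows into tab-separated text suitable
# for feeding to PostgreSQL's COPY ... FROM STDIN.
import csv
import io

def rows_for_copy(src):
    """Yield COPY-ready lines: tab-separated, with \\N for NULL."""
    for row in csv.reader(src):
        # placeholder transformation: trim whitespace, NULL-ify empty cells
        cells = [c.strip() or r"\N" for c in row]
        yield "\t".join(cells) + "\n"

sample = io.StringIO("alice, 30\nbob,\n")
copy_text = "".join(rows_for_copy(sample))
```

Piping that output straight into `psql -c "\copy t FROM STDIN"` keeps the transformation and the bulk load streaming concurrently instead of staging an intermediate file.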

However, I am still curious as to the rather slow COPYs from psql to
local disks.  Like I mentioned previously, I was only seeing about 5.7
MB/s (1.8 GB / 330 seconds), where it seemed like others were doing
substantially better.  What sorts of things should I look into?

Thanks!

---------------------------(end of broadcast)---------------------------
TIP 6: explain analyze is your friend
