On 27.01.2012 20:30, Jayashankar K B wrote:
> Hi Heikki Linnakangas: We are using a series of INSERT statements to insert
> the records into the database.
> Sending data in binary is not an option, as the module that writes into the
> DB has been finalized.
> We do not have control over that.

That certainly limits your options.

> Please let me know how we can proceed. On the net I couldn't get hold of any
> good example where Postgres has been used on a limited hardware system.

I don't think there's anything in particular about postgres that would make it a poor choice on a small system, as far as CPU usage is concerned anyway. But inserting rows into a database is certainly slower than, say, writing them into a flat file.
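
Much of that per-row cost is commit overhead, so if there is any way to wrap the stream of statements in a single transaction, that usually pays off. A rough sketch, assuming the generated INSERTs can be collected into a file and fed through psql (file and database names here are just placeholders):

    # psql's -1/--single-transaction runs the whole file in one transaction,
    # so there is one commit for the batch instead of one per row.
    psql -1 -q -f inserts.sql mydb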

At what rate are you doing the INSERTs? And how fast would they need to be? Remember that it's normal that while the INSERTs are running, postgres will use all the CPU it can to process them as fast as possible. So the real question is at what rate they need to be processed to meet your target. Lowering the process priority with 'nice' might help too, to give the other important processes priority over postgres.
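
For example, something along these lines (purely a sketch; 'db_writer' and the backend pid are placeholders for whatever applies on your system):

    # If the process issuing the INSERTs can simply be started at a lower priority:
    nice -n 10 ./db_writer

    # Or lower the priority of an already-running postgres backend, using the
    # pid of the backend serving that connection:
    renice 10 -p <backend_pid>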

The easiest way to track down where the time is spent would be to run a profiler, if that's possible on your platform.
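
On a Linux target that has 'perf' available (it may well not be on an embedded build, so treat this only as an illustration), that could look like:

    # Sample the backend doing the INSERTs for 30 seconds, then inspect where
    # the CPU time went:
    perf record -p <backend_pid> -g -- sleep 30
    perf report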

--
  Heikki Linnakangas
  EnterpriseDB   http://www.enterprisedb.com
