hi there,
I am trying to import large data files into pg.
for now i used the xargs linux command to spawn the file line by line and
to set and use the maximum available connections.
we use pgpool as connection pool to the database, and so try to maximize the
concurrent data import of the
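A minimal sketch of this chunk-and-parallelize idea, assuming GNU coreutils and a CSV file; the file name, table name, and job count are made up for illustration, and the real psql invocation is left as a comment since importing per chunk with \copy is usually far faster than spawning one session per line:

```shell
# Sketch only: data.csv, mytable, and JOBS=4 are hypothetical names.
seq 1 100 > data.csv                  # stand-in for the real data file
JOBS=4                                # keep at or below pgpool's connection limit
split -n l/"$JOBS" data.csv chunk_    # l/N: N chunks, never splitting a line
# real import would be something like:
#   ls chunk_* | xargs -P "$JOBS" -I{} psql -d mydb -c "\copy mytable FROM '{}' CSV"
ls chunk_* | xargs -P "$JOBS" -I{} wc -l {}   # placeholder command, run 4 at a time
```

xargs -P runs up to JOBS commands concurrently, so the parallelism is bounded by the pool size instead of one connection per line.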
On 16 jun. 2012, at 17:37, Edson Richter edsonrich...@hotmail.com wrote:
On 16/06/2012 12:04, h...@101-factory.eu wrote:
thanks all, i will be looking into it.
Kind regards,
Henk
On 16 jun. 2012, at 18:23, Edson Richter edsonrich...@hotmail.com wrote:
On 16/06/2012 12:59, h...@101-factory.eu wrote:
thanks, i thought about splitting the file, but that did not work out well.
so we receive 2 files
are there any useful startup scripts for running in a master-slave setup with
pgpool?
Henk
On 27 apr. 2012, at 19:22, leaf_yxj leaf_...@163.com wrote:
My OS is Red Hat Linux 5.5, and my database is Greenplum 4.2.1 (PostgreSQL
8.2.15). I will take a look at the init.d directory.
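For the init.d route, a hypothetical SysV-style wrapper might look like the sketch below. The gpadmin user is an assumption; gpstart/gpstop are Greenplum's standard control utilities (-a skips the confirmation prompt). Writing it to a file lets us exercise the usage branch without a running cluster:

```shell
# Hypothetical /etc/init.d/greenplum sketch for RHEL 5 (SysV init style);
# the gpadmin user is an assumption.
cat > greenplum.init <<'EOF'
#!/bin/bash
case "$1" in
  start) su - gpadmin -c "gpstart -a" ;;
  stop)  su - gpadmin -c "gpstop -a" ;;
  *)     echo "Usage: $0 {start|stop}"; exit 1 ;;
esac
EOF
chmod +x greenplum.init
./greenplum.init status || true   # unknown action -> prints usage, exits 1
```

A real script would also handle status and restart and declare chkconfig run levels, but that is beyond this sketch.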
run the script with bash -v, or bash -x for extra detailed verbose logging,
and see what's wrong. most of the time it is a
matter of using the right closure of the statements with ' or
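A tiny illustration of the two modes (the script name and its contents are made up): -v echoes each source line as it is read, while -x echoes each command after expansion, so a misplaced quote or a bad expansion shows up immediately in the trace:

```shell
# Illustration only: import.sh and its contents are hypothetical.
cat > import.sh <<'EOF'
#!/bin/bash
TABLE='events'
echo "importing into $TABLE"
EOF
bash -v import.sh   # prints each source line (to stderr) before running it
bash -x import.sh   # prints each expanded command, prefixed with +
```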
Henk Bronk
On 27 Mar. 2012, at 20:37, W. David Jarvis william.d.jar...@gmail.com wrote:
Hello all -
I've been trying