Thanks all, I will be looking into it.
Kind regards,
Henk
On 16 jun. 2012, at 18:23, Edson Richter wrote:
> On 16/06/2012 12:59, h...@101-factory.eu wrote:
>> Thanks, I thought about splitting the file, but that did not work out well.
>>
>> so we receive 2
Henk
On 16 jun. 2012, at 17:37, Edson Richter wrote:
> On 16/06/2012 12:04, h...@101-factory.eu wrote:
>> hi there,
>>
>> I am trying to import large data files into pg.
>> For now I used the xargs Linux command to process the file line by line and
hi there,
I am trying to import large data files into pg.
For now I used the xargs Linux command to process the file line by line and
use the maximum available connections.
We use pgpool as the connection pool to the database, and so try to maximize the
concurrent data import of the file.
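A minimal sketch of the splitting-plus-parallel-loading idea, not the poster's actual script. The file name, chunk size, and parallelism are assumptions, and `wc -l` stands in for the real load command (something like `psql -c "\copy import_target FROM '{}' CSV"`, where `import_target` is a hypothetical table):

```shell
# Generate a stand-in for the large input file (1000 lines).
seq 1 1000 > data.csv

# Split into fixed-size chunks instead of spawning one process per line.
split -l 250 data.csv chunk_          # produces chunk_aa .. chunk_ad

# Run up to 4 loaders concurrently; replace `wc -l` with the real
# loader, e.g.: psql -c "\copy import_target FROM '{}' CSV"
ls chunk_* | xargs -P 4 -I{} wc -l {}
```

Keeping the `-P` value at or below the pool's connection limit avoids exhausting pgpool.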
Are there any useful startup scripts when running in a master-slave setup with
pgpool?
Henk
On 27 apr. 2012, at 19:22, leaf_yxj wrote:
> My OS is Red Hat Linux 5.5, and my database is Greenplum 4.2.1 (PostgreSQL
> 8.2.15). I will take a look at the init.d directory.
>
> Thanks, guys. Any
Run the script with bash -v to echo each line as it is read, or
bash -x for detailed execution tracing.
See what's wrong; most of the time it is a
matter of using the right closing of the statements with ' or ".
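For illustration, here is a tiny throwaway script (hypothetical, not the one under discussion) run under both flags:

```shell
# Create a small throwaway script (hypothetical example).
cat > demo.sh <<'EOF'
name="world"
echo "hello, $name"
EOF

bash -v demo.sh   # echoes each source line as it is read
bash -x demo.sh   # prints each command after expansion; unbalanced ' or " quoting shows up here
```

With -x, a quoting mistake is usually visible because the expanded command differs from what you expected.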
Henk Bronk
On 27 mrt. 2012, at 20:37, "W. David Jarvis" wrote:
> Hello all -
>
> I've been trying to get a bash script s