Sorry, I forgot to specify which OS. Yes it's RH 7.2, and PostgreSQL 7.1.3.
-Original Message-
From: Shawn Tagseth [mailto:[EMAIL PROTECTED]]
Sent: 20 March 2002 17:25
To: Rasmus Mohr
Subject: RE: [ADMIN] Failure loading TCL/u
Redhat 7.2?
-Original Message-
From: Rasmus Mohr
Hi Jodi,
> None of the data is actually committed to the database until the scripts
> complete so I believe that autocommit is turned off.
>
what if you try to write the output of your script into a separate file
and pipe it to psql as input? What I mean is to strip off the processing
time for the spreadsheet parsing from the time the inserts themselves take.
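A minimal sketch of that idea, assuming the loader can be made to print its
INSERT statements instead of executing them (the --print-sql switch, the file
name and the database name below are made up for illustration):

# step 1: parsing/processing only; time this step on its own
perl loader.pl --print-sql > inserts.sql

# step 2: feed the statements to psql inside one explicit transaction,
# so they are not committed row by row
( echo "BEGIN;"; cat inserts.sql; echo "COMMIT;" ) | psql mydb

Comparing the timing of the two steps shows whether the time goes into parsing
the spreadsheet or into the inserts themselves.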
I have a set of data that will compose a table with 32 million rows. I currently run PostgreSQL with tables as large as 750,000 rows.
Does anyone have experience with tables this large? In addition, I have been reading about moving PostgreSQL tables to
another hard drive; can an
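On the other-drive question: before tablespaces existed, the usual low-level
trick was to move the file that backs the table and leave a symlink behind,
with the postmaster shut down. A rough sketch (the data directory, the
database OID 12345 and the table file 67890 below are made-up examples; the
real ones come from pg_database and pg_class):

# stop the server first; never move files under a running postmaster
pg_ctl -D /var/lib/pgsql/data stop

cd /var/lib/pgsql/data/base/12345    # 12345 = OID of the database (example)
mv 67890 /disk2/pgdata/67890         # 67890 = file backing the table (example)
ln -s /disk2/pgdata/67890 67890

pg_ctl -D /var/lib/pgsql/data start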
None of the data is actually committed to the database until the scripts
complete so I believe that autocommit is turned off.
- Original Message -
From: "Nick Fankhauser" <[EMAIL PROTECTED]>
To: "Jodi Kanter" <[EMAIL PROTECTED]>; "Postgres Admin List"
<[EMAIL PROTECTED]>
Sent: Wednesday,
Jodi-
Have you tried turning autocommit off & doing a single commit after the
load?
-NickF
-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED]]On Behalf Of Jodi Kanter
Sent: Wednesday, March 20, 2002 2:19 PM
To: Postgres Admin List
Subject: [ADMIN] slow inserts
I am c
Ladislav Jirsa wrote:
> Hello!
>
> I run PostgreSQL 7.2 on RedHat Linux 7.2, kernel 2.4. I want to import and
> export dbf tables to and from Postgres. I found the pg2xbase library, which
> requires the xbase library. After installation, the conversion does not work
> (segmentation fault, exports empty tables
Jodi,
I believe DBI defaults to autocommitting queries, which
slows things down. If you aren't specifically turning off autocommit, I'd
suggest doing so; that way all the inserts go into a single transaction, which
makes it much faster. You'd do something like:
DBI->connect("dbi:Pg:dbnam
I am currently using a Perl data loader that was set up to load data to
three particular tables. The majority of this data is entered into one table.
The scripts obtain the data by parsing it out of an Excel spreadsheet.
My loads seem to take a very long time. Each file only has about 12,000
This thread helped me (I was using RH 7.2):
http://archives.postgresql.org/pgsql-hackers/2001-11/msg00312.php
I already had the source RPM downloaded, installed, and recompiled though, so
it was a simple matter of:
gcc -pipe -shared -Wl,-soname,libpltcl.so.0 -o pltcl.so pltcl.o -L/usr/lib
-ltcl -ld
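After rebuilding and copying the library back into place, a quick sanity check
is to see whether it now pulls in libtcl at all, and then retry createlang
(the path and database name here are only examples):

# libtcl should appear among the shared library dependencies; if it is
# missing, Tcl_CreateSlave stays unresolved when the backend loads the file
ldd /usr/lib/pgsql/pltcl.so

# then retry the language installation
createlang pltclu mydb    # or pltcl, whichever language was being installed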
>"Fred Moyer" <[EMAIL PROTECTED]> writes:
> ran time pg_dump -c --verbose database > datafile.psql from the command
> line and got a file size limit exceeded. datafile.psql stopped at 2 gigs.
> any ideas how to exceed that limit?
Workaround: Pipe output to gzip/bzip2, if compressed file size do
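If even the compressed file would run past the limit, the dump stream can also
be chopped into pieces with split(1); a sketch (database and file names are
examples):

# split the dump into chunks safely below the 2 GB limit
pg_dump -c --verbose mydb | split -b 1000m - /backup/mydb.sql.

# restore by concatenating the pieces back into psql
cat /backup/mydb.sql.* | psql mydb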
Hidey hidey hidey hi,
Have you tried piping the output to gzip then to another file? So
pg_dump -c --verbose database | gzip > /foo/bar.gzip?
I also use ftpbackup to move the gzip file to another server that has a ton
of disk space and large file support. Here's a line from the script:
pg_dump C
At 12:15 AM 3/20/02 -0500, Tom Lane wrote:
>"Fred Moyer" <[EMAIL PROTECTED]> writes:
> > ran time pg_dump -c --verbose database > datafile.psql from the command
> line
> > and got a file size limit exceeded. datafile.psql stopped at 2 gigs. any
> > ideas how to exceed that limit?
>
> > redhat 7
Kristopher Yates wrote:
>
> select a, b, c where length(a=2);
>
> How do you do this? I couldn't find a postgres command for this kind of
> call.
SELECT a, b, c FROM table WHERE LENGTH(a) = 2;
best regards
--
Andreas 'ads' Scherbaum
--
Hi, everybody!
I was wondering if there is somebody out there who could help me
understand how index extensions work...
Let me state the problem first.
I have many (15) boolean attributes and I need to be able to search the
database for entries with any combination of those attributes fo
JC,
On Fri, Mar 15, 2002 at 02:11:21PM +0100, Jean-Christophe FABRE wrote:
> More generally,
> what is the best way to schedule tasks such as dump, vacuum, and analyze under
> cygwin?
> As I noticed before, under Linux (and generic UNIX systems) you can use
> cron to do that, but under cygwin?
Ye
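For what it's worth, once a cron daemon is actually running (under Cygwin that
means installing and starting the cron package as a service, which is assumed
here), the crontab entries look the same as on Linux; the database name and
paths below are just examples:

# nightly vacuum/analyze at 02:30, weekly compressed dump on Sunday at 03:00
30 2 * * * vacuumdb --analyze mydb
0 3 * * 0 pg_dump mydb | gzip > /backup/mydb.sql.gz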
Hello!
I run PostgreSQL 7.2 on RedHat Linux 7.2, kernel 2.4. I want to import and
export dbf tables to and from Postgres. I found the pg2xbase library, which
requires the xbase library. After installation, the conversion does not work
(segmentation fault, exports empty tables, distortion of numeric fields
select a, b, c where length(a=2);
How do you do this? I couldn't find a postgres command for this kind of
call.
Thanks
I've unsuccessfully tried loading TCL/u with the createlang command; every
time I get this error:
ERROR: Load of file /usr/lib/pgsql/pltcl.so failed:
/usr/lib/pgsql/pltcl.so: undefined symbol: Tcl_CreateSlave
createlang: language installation failed
Any suggestions?