Hi,
Regarding repetitive data: this is a bit tricky. The dataset contains
some varchars with repetitive data, but the number of distinct
values defeats the purpose of putting them into a separate table.
Unfortunately, most of the fields in the table are numbers, which cannot be
op
Here's more info I was able to track down...
I am making a call to a C function from my SQL, for example:
SELECT funct(arg) FROM x;
Here's the CREATE statement for the function funct:
CREATE FUNCTION funct(path)
RETURNS point
AS '${PGLIB}/xxx.so' LANGUAGE 'c';
Inside the C function funct, when
Thanks for the help...
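For readers following along: a C function declared this way under the old "version 0" calling convention takes pointer arguments and returns a freshly allocated pointer result. The sketch below illustrates that shape in plain C only; Point and Path here are simplified stand-ins, not the server's geometric types, malloc() stands in for the backend's palloc(), and the centroid computation is a hypothetical example, not the poster's actual funct().

```c
/* Plain-C sketch of the kind of computation a user-defined C function
 * like funct(path) RETURNS point might perform.  Point and Path are
 * simplified stand-ins for the server's types, and malloc() stands in
 * for the backend's palloc(); under the old "version 0" convention the
 * function receives pointers and returns a newly allocated result. */
#include <stdlib.h>

typedef struct { double x, y; } Point;

typedef struct {
    int    npts;   /* number of points in the path */
    Point *pts;    /* the points themselves */
} Path;

/* Return the centroid of the path's points (hypothetical example logic). */
Point *path_centroid(const Path *path)
{
    Point *result = malloc(sizeof(Point));
    double sx = 0.0, sy = 0.0;
    int i;

    if (result == NULL)
        return NULL;
    result->x = result->y = 0.0;
    if (path == NULL || path->npts <= 0)
        return result;
    for (i = 0; i < path->npts; i++) {
        sx += path->pts[i].x;
        sy += path->pts[i].y;
    }
    result->x = sx / path->npts;
    result->y = sy / path->npts;
    return result;
}
```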
I am using version 7.1.2 of PostgreSQL on Red Hat Linux 7.1 (2.4 kernel). I am running a
set of external functions (in C) that are dynamically loaded when needed (from a .so)
that do some computations on rows. It works fine if I do a few rows, but crashes when
done in "bat
Hi Steve,
I don't know of a direct method, e.g. turning off the 'die' behaviour of COPY
etc., but when I was in a similar situation, I found it easier to just load
the file in vi and clean the data first. Admittedly I only had a couple of
hundred lines, though.
Alternatively, you could write a sma
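A small pre-filter script along those lines might look like the sketch below: rows with the expected number of columns are kept, everything else is set aside for hand-editing. The tab delimiter, the column count, and the file names are assumptions about the data format, not anything stated in the thread.

```python
# Sketch of a small pre-filter for a COPY input file: rows with the
# expected number of columns are kept, everything else is collected in
# a rejects list for hand-editing.  The tab delimiter and the expected
# column count are assumptions about the data format.

def split_clean(lines, expected_cols, delim="\t"):
    """Partition lines into (good, bad) by a simple column count."""
    good, bad = [], []
    for line in lines:
        cols = line.rstrip("\n").split(delim)
        (good if len(cols) == expected_cols else bad).append(line)
    return good, bad

def filter_file(src, clean_path, reject_path, expected_cols):
    """Filter src into a clean file plus a rejects file for inspection.

    File names are placeholders for illustration."""
    with open(src) as f:
        good, bad = split_clean(f, expected_cols)
    with open(clean_path, "w") as g:
        g.writelines(good)
    with open(reject_path, "w") as b:
        b.writelines(bad)
    return len(good), len(bad)
```

The clean file can then be fed to COPY, while the rejects file is small enough to fix by hand in vi.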
Hi!
I am wondering if it is OK to put the ODBC driver for Windows on a central
university server? This will make it available to everyone who has access
through the university Netware network.
Please let me know.
Thank you!
--
Shannon Peevey
UNT-Central Web Support
[EMAIL PROTECTED]
940-36
Is there a method to change the datatype of a field? (e.g., convert a
'timestamp' field to a 'date' field)
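There is no single type-changing ALTER command in PostgreSQL releases of this era (ALTER TABLE ... ALTER COLUMN TYPE arrived much later, in 8.0), but a common workaround is to add a column of the new type, populate it with a cast, and retire the old one. The table and column names below are hypothetical:

```sql
-- Add a column of the new type and populate it with a cast:
ALTER TABLE events ADD COLUMN happened_on date;
UPDATE events SET happened_on = happened_at::date;

-- Releases without DROP COLUMN support can instead rebuild the table:
-- CREATE TABLE events_new AS
--     SELECT id, happened_at::date AS happened_on FROM events;
```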
Thanks!
---(end of broadcast)---
TIP 5: Have you checked our extensive FAQ?
http://www.postgresql.org/users-lounge/docs/faq.html
Yeah, it worked for me as well.
Xue-Feng Yang <[EMAIL PROTECTED]> wrote in article
<[EMAIL PROTECTED]>...
| --- Henry House <[EMAIL PROTECTED]> wrote:
...cut
Hello all:
I'm trying to load about 10M rows of data into a simple postgres table. The
data is straightforward and fairly clean, but does have glitches every few
tens of thousands of rows. My problem is that when COPY hits a bad row it
just aborts, leaving me to go back, delete or clean up the ro
Hello all:
Sorry for the bad subject line on the last version of this post.
I'm trying to load about 10M rows of data into a simple postgres table. The
data is straightforward and fairly clean, but does have glitches every few
tens of thousands of rows. My problem is that when COPY hits a bad ro
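One pragmatic way around COPY's all-or-nothing behaviour is to split the file into fixed-size chunks, so a bad row aborts only that chunk's COPY rather than the whole 10M-row load. The sketch below demonstrates the idea; the file names and chunk size are assumptions, and load_chunk is a hypothetical stand-in for the real psql COPY call shown in the comment.

```shell
# Sketch: split the load file into chunks so that a bad row aborts only
# that chunk's COPY instead of the whole 10M-row load.  In practice the
# loop body would run something like:
#   psql -c "COPY mytable FROM '$PWD/$f'" mydb
seq 1 120 > bigfile.dat            # stand-in data file for illustration
split -l 50 bigfile.dat chunk_     # real loads might use -l 50000

# Hypothetical stand-in for the psql COPY call; here a row containing
# any non-digit character counts as "bad".
load_chunk() { ! grep -q '[^0-9]' "$1"; }

: > failed_chunks.txt
for f in chunk_*; do
    load_chunk "$f" || echo "$f" >> failed_chunks.txt
done
```

Failed chunks are small enough to clean up by hand and reload, while the good chunks are already committed.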
The script file that I was talking about can be found at
/usr/lib/postgresql/share/contrib/array_iterator.sq
Luis Sousa
Luis Sousa wrote:
> Hello
>
> I know there is a script file in postgres (I think) that has all of
> these operators. But what is the name of the file?
>
> Is it in the p