Re: [GENERAL] Duplicate Row Removal

2005-11-05 Thread Berend Tober

Dean Gibson (DB Administrator) wrote:


CREATE TABLE new_name AS SELECT DISTINCT * FROM old_name;

DROP TABLE old_name;

ALTER TABLE new_name RENAME TO old_name;



The problem with this technique is that it doesn't preserve indexes, 
foreign key references, or other dependencies on the original table.
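
If you do go the copy-and-rename route, any dependent objects have to be 
recreated by hand afterwards. A hypothetical sketch (the index, constraint, 
table, and column names here are made up for illustration):

```sql
-- After the DROP and RENAME, recreate whatever depended on the
-- original table, e.g. an index and an inbound foreign key:
CREATE UNIQUE INDEX old_name_col1_idx ON old_name (col1);

ALTER TABLE other_table
  ADD CONSTRAINT other_table_col1_fkey
  FOREIGN KEY (col1) REFERENCES old_name (col1);
```

Check pg_dump output or \d in psql for the original table beforehand, so 
you know what needs rebuilding.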


Another approach is to temporarily add an integer column, populate it 
with sequential values, and use that column to uniquely identify rows 
that are otherwise duplicates. You can then use aggregation to find and 
delete the rows you don't need, and finally drop the temporary column. HTH.


-- BMT



On 2005-11-04 17:15, Peter Atkins wrote:


All,

I have a duplicate row problem and, to make matters worse, some tables 
don't have a PK or any unique identifier.

Anyone have any thoughts on how to remove dups?




---(end of broadcast)---
TIP 6: explain analyze is your friend






Re: [GENERAL] Duplicate Row Removal

2005-11-04 Thread Dean Gibson (DB Administrator)

CREATE TABLE new_name AS SELECT DISTINCT * FROM old_name;

DROP TABLE old_name;

ALTER TABLE new_name RENAME TO old_name;

On 2005-11-04 17:15, Peter Atkins wrote:

All,

I have a duplicate row problem and, to make matters worse, some tables don't have a PK or any unique identifier.


Anyone have any thoughts on how to remove dups?





Re: [GENERAL] Duplicate Row Removal

2005-11-04 Thread Tom Lane
"Peter Atkins" <[EMAIL PROTECTED]> writes:
> I was hoping to have a system oid for each row but it looks like that's
> not the case. Anyone have any thoughts on how to remove dups?

ctid always works ...

regards, tom lane
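
A sketch of the ctid approach, again assuming a hypothetical table 
old_name with columns col1 and col2. ctid is the row's physical location, 
so it is unique per row even when no column is; it is not stable across 
updates or VACUUM FULL, so do the whole cleanup in one statement:

```sql
-- Within each group of identical rows, number the rows arbitrarily
-- and delete all but one. Only tid equality is needed here, so this
-- avoids relying on ordering support for the tid type (min(ctid) and
-- ctid comparisons are only available on recent PostgreSQL versions).
DELETE FROM old_name
 WHERE ctid IN (SELECT ctid
                  FROM (SELECT ctid,
                               row_number() OVER (PARTITION BY col1, col2) AS rn
                          FROM old_name) AS numbered
                 WHERE rn > 1);
```

Since the rows in each group are exact duplicates, it doesn't matter 
which one row_number() happens to keep.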
