Re: Removing Duplicate Records

2009-07-14 Thread Matt Neimeyer
Ah... Yes. Good point. I like this because I was planning on keeping the output somewhere for a while, in case we need an "accounting" at some point. So it will be easy enough to dump what's being deleted to the screen while we loop over our candidates. Thanks!
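The select-print-delete loop Matt describes can be sketched like this. This is only an illustration, not anyone's actual code from the thread: it uses Python's built-in sqlite3 as a stand-in for MySQL, and the `link` table and its columns are invented names.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE link (id INTEGER PRIMARY KEY, org_id INTEGER, contact_id INTEGER)")
cur.executemany("INSERT INTO link (org_id, contact_id) VALUES (?, ?)",
                [(1, 10), (1, 10), (2, 20), (2, 20), (2, 20)])

# Candidates: every row that is NOT the lowest-id member of its
# (org_id, contact_id) group.
cur.execute("""
    SELECT id, org_id, contact_id FROM link
    WHERE id NOT IN (SELECT MIN(id) FROM link GROUP BY org_id, contact_id)
""")
candidates = cur.fetchall()

deleted = []
for row in candidates:
    print("deleting duplicate:", row)   # the "accounting" record
    deleted.append(row)
    cur.execute("DELETE FROM link WHERE id = ?", (row[0],))

remaining = cur.execute("SELECT COUNT(*) FROM link").fetchone()[0]
```

Printing (or logging) each candidate before the per-row DELETE gives the audit trail, at the cost of one DELETE statement per duplicate instead of a single set-based statement.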

Re: Removing Duplicate Records

2009-07-14 Thread Brent Baisley
That's assuming that there is a unique identifier field, like an auto-increment field, although that could be added after the fact. Also, you need to run the query multiple times until it returns no affected records. So if there are 4 copies of a record, it would need to be run 3 times to get rid of them.
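Brent's point, that a per-group MAX(id) delete removes only one copy per group per run and so must be repeated until zero rows are affected, can be demonstrated with a small sketch. Again sqlite3 stands in for MySQL and the table and column names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE link (id INTEGER PRIMARY KEY, org INTEGER, contact INTEGER)")
cur.executemany("INSERT INTO link (org, contact) VALUES (?, ?)",
                [(1, 10), (1, 10), (1, 10), (1, 10), (2, 20)])  # 4 copies of (1, 10)

passes = 0
while True:
    # MAX(id) is a single row per group, so each pass removes at most
    # one duplicate per group; n copies therefore need n - 1 passes.
    cur.execute("""
        DELETE FROM link WHERE id IN (
            SELECT MAX(id) FROM link
            GROUP BY org, contact
            HAVING COUNT(*) > 1)
    """)
    if cur.rowcount == 0:   # no affected records: we are done
        break
    passes += 1

remaining = cur.execute("SELECT COUNT(*) FROM link").fetchone()[0]
print(passes, remaining)
```

With 4 copies of one record the loop runs 3 deleting passes plus one final pass that affects nothing, exactly as described.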

Re: Removing Duplicate Records

2009-07-14 Thread Marcus Bointon
You can combine the two queries you have in option 3 (you'll need to change field names, but you should get the idea), something like this: DELETE table1 FROM table1, (SELECT MAX(id) AS dupid, COUNT(id) AS dupcnt FROM table1 WHERE field1 IS NOT NULL GROUP BY link_id HAVING dupcnt>1) AS dups W
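Marcus's combined statement uses MySQL's multi-table DELETE syntax, which the snippet cuts off. A single-pass variant of the same idea, keeping one id per group and deleting everything else, can be sketched portably. This is an adaptation, not his exact query: it keeps MIN(id) rather than deleting MAX(id), uses sqlite3 in place of MySQL, and loosely borrows his invented names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE table1 (id INTEGER PRIMARY KEY, link_id INTEGER, field1 TEXT)")
cur.executemany("INSERT INTO table1 (link_id, field1) VALUES (?, ?)",
                [(1, "a"), (1, "a"), (1, "a"), (2, "b")])

# Keep the lowest id in each link_id group; delete all the others
# in a single statement, however many copies each group has.
cur.execute("""
    DELETE FROM table1 WHERE id NOT IN (
        SELECT MIN(id) FROM table1 GROUP BY link_id)
""")
removed = cur.rowcount
kept = cur.execute("SELECT COUNT(*) FROM table1").fetchone()[0]
```

Unlike the repeated MAX(id) approach, `NOT IN (SELECT MIN(id) ...)` clears every extra copy in one pass.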

RE: Removing Duplicate Records

2009-07-14 Thread Nathan Sullivan
[Quoting the original message:] In our database we have an Organizations table and a Contacts table, and a linking table that associates Contacts with Organizations. Occasionally we manually add to the linking table with information gleaned from outside data sources. This is common

Removing Duplicate Records

2009-07-14 Thread Matt Neimeyer
In our database we have an Organizations table and a Contacts table, and a linking table that associates Contacts with Organizations. Occasionally we manually add to the linking table with information gleaned from outside data sources. This is common enough to be annoying, since it ends up with duplicates

RE: removing duplicate records

2002-07-31 Thread Gordon
If you use IGNORE in the insert (INSERT IGNORE INTO new_table) you will get the result you want.
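Gordon's suggestion, a new table with a unique key plus an IGNORE insert, can be sketched as follows. MySQL spells it `INSERT IGNORE`; the sqlite3 stand-in used here spells it `INSERT OR IGNORE`, and the table and column names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE old_table (org_id INTEGER, contact_id INTEGER)")
cur.executemany("INSERT INTO old_table VALUES (?, ?)",
                [(1, 10), (1, 10), (2, 20)])

# The new table carries a UNIQUE key over the columns that define
# what counts as a duplicate.
cur.execute("""CREATE TABLE new_table (
    org_id INTEGER, contact_id INTEGER,
    UNIQUE (org_id, contact_id))""")

# IGNORE silently skips any row that would violate the unique key,
# so only the first copy of each duplicate survives the copy.
# (MySQL syntax: INSERT IGNORE INTO new_table SELECT * FROM old_table)
cur.execute("INSERT OR IGNORE INTO new_table SELECT * FROM old_table")
count = cur.execute("SELECT COUNT(*) FROM new_table").fetchone()[0]
```

The unique key does the deduplication declaratively; no script loop or repeated DELETE is needed.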

RE: removing duplicate records

2002-07-30 Thread David Kramer
sue. DK [Quoting Vivian Wang:] Walt, don't trust GROUP BY. I am using mysqldump and sort -u on *nix for duplicates. Try both

Re: removing duplicate records

2002-07-30 Thread Vivian Wang
> values contained in the other columns necessary? If you tell me more about what you're trying to do and provide some Table DDL I can help you write this query. Just let me know! Thanks, DK

Re: removing duplicate records

2002-07-30 Thread walt
> provide some Table DDL I can help you write this query. Just let me know! Thanks, DK

RE: removing duplicate records

2002-07-30 Thread David Kramer
statement. [Quoting walt:] Does anyone know a good way to find and remove duplicate records from a table? I can create an identical table structure and use

Re: removing duplicate records

2002-07-30 Thread Gerald Clark
If the columns are defined as NOT NULL, just add a unique index on all the fields: ALTER IGNORE TABLE your_table ADD UNIQUE (col1, col2, ..., colN); walt wrote: >Does anyone know a good way to find and remove duplicate records from a table? >I can create an identical table structure and use a script to pull records

removing duplicate records

2002-07-30 Thread walt
Does anyone know a good way to find and remove duplicate records from a table? I can create an identical table structure and use a script to pull records from the existing table and insert them into the new table if they are not duplicates, but I'd rather not do it that way. Here is an example of a
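The copy-to-a-new-table approach walt describes can actually be done in a single statement with SELECT DISTINCT, rather than a row-by-row script. A minimal sketch, using sqlite3 in place of MySQL and invented table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE old_table (org_id INTEGER, contact_id INTEGER)")
cur.executemany("INSERT INTO old_table VALUES (?, ?)",
                [(1, 10), (1, 10), (1, 10), (2, 20)])

# Identical structure, then copy only distinct rows across in one pass.
cur.execute("CREATE TABLE new_table (org_id INTEGER, contact_id INTEGER)")
cur.execute("INSERT INTO new_table SELECT DISTINCT org_id, contact_id FROM old_table")
rows = cur.execute("SELECT COUNT(*) FROM new_table").fetchone()[0]
```

Afterwards the old table can be dropped and the new one renamed into its place. The later replies in the thread (GROUP BY, unique index plus IGNORE, multi-table DELETE) are variations on avoiding the intermediate script entirely.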