On Feb 4, 2008 11:36 AM, Artifex Maximus <[EMAIL PROTECTED]> wrote:

Hello!

I am looking for an easy way to eliminate duplicates, but at the row level.

I have two tables: a destination table (a) for all the de-duplicated data,
and an input table (b) which might contain duplicates of rows already in
table a. Right now I am using this kind of insert:

INSERT INTO a
SELECT fields
FROM b ...
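One way to make that kind of insert skip rows already present in the
destination is a NOT EXISTS guard. This is only a sketch, not from the
original thread; it assumes the columns that define a duplicate are col1
and col2, and uses DISTINCT so duplicates inside b itself collapse too:

INSERT INTO a (col1, col2)
SELECT DISTINCT b.col1, b.col2
FROM b
WHERE NOT EXISTS
      (SELECT 1 FROM a
        WHERE a.col1 = b.col1
          AND a.col2 = b.col2);

If a has a UNIQUE key over those columns, INSERT IGNORE INTO a SELECT ...
FROM b gets the same effect by silently discarding the key violations.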
If the ID doesn't represent anything, you can
CREATE TABLE new_table SELECT DISTINCT Row1, Row2 FROM old_table
And then recreate your index(es).
All your autoincrement IDs will be changed.
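Spelled out end to end, that approach might look like the following. This
is only a sketch under the same assumption that the id carries no meaning;
new_table and old_table_bak are placeholder names:

CREATE TABLE new_table
SELECT DISTINCT Row1, Row2 FROM old_table;

ALTER TABLE new_table
  ADD id INT NOT NULL AUTO_INCREMENT PRIMARY KEY FIRST,
  ADD UNIQUE INDEX (Row1, Row2);

RENAME TABLE old_table TO old_table_bak,
             new_table TO old_table;

The UNIQUE index also keeps the same duplicates from creeping back in
later.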
On 4/18/06, William Fong <[EMAIL PROTECTED]> wrote:

Sample Data:

ID  Row1  Row2
 1  A     B
 2  A     B

Row1 and Row2 are duplicated, so you only want one of the rows. Which ID do
you want to keep?

-will
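If the answer is "keep the lowest ID", a self-join delete can do it in
place. A sketch, untested; the table and column names follow the thread's
mytable example:

DELETE t2
FROM mytable t1
JOIN mytable t2
  ON t2.row1 = t1.row1
 AND t2.row2 = t1.row2
 AND t2.id   > t1.id;

Every row that has a duplicate with a smaller id is removed, so exactly one
row per (row1, row2) pair survives.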
On 4/17/06, Patrick Aljord <[EMAIL PROTECTED]> wrote:

hey all,
I have a table "mytable" that looks like this:

id   tinyint primary key auto_increment
row1 varchar 150
row2 varchar 150

I would like to remove all duplicates: if n records have the same row1 and
row2, keep only one of them and delete the rest. Any idea how to do this?
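On the MySQL versions current at the time, another popular trick was ALTER
IGNORE, which silently drops the rows that would violate the new unique key
(a historical sketch; this form of the statement was removed in MySQL 5.7):

ALTER IGNORE TABLE mytable
  ADD UNIQUE INDEX idx_row1_row2 (row1, row2);

As a bonus, the unique index keeps new duplicates out afterwards.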
Rich <[EMAIL PROTECTED]> wrote on 03/30/2006 09:11:56 PM:
> Hi there. Any quick way of killing duplicate records?
>
> Cheers
>
Yes. Some ways involve subqueries, others temporary tables. What version
are you on? What are your table definitions (use SHOW CREATE TABLE to dump
them)?
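The temporary-table route usually looks something like this. A sketch,
untested; it assumes a table t deduplicated on col1 and col2, and that
nothing else writes to t in between (ids and any other columns are lost):

CREATE TEMPORARY TABLE t_dedup
SELECT DISTINCT col1, col2 FROM t;

DELETE FROM t;

INSERT INTO t (col1, col2)
SELECT col1, col2 FROM t_dedup;

DROP TEMPORARY TABLE t_dedup;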
Subqueries will help you.
--Praj
On Thu, 30 Mar 2006 21:11:56 -0500
Rich <[EMAIL PROTECTED]> wrote:
> Hi there. Any quick way of killing duplicate records?
>
> Cheers
>

Rich wrote:
> Hi there. Any quick way of killing duplicate records?
>
> Cheers

Subqueries probably.
--
Smileys rule (cX.x)C --o(^_^o)
Dance for me! ^(^_^)o (o^_^)o o(^_^)^ o(^_^o)
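A subquery version that keeps the lowest id of each duplicate group might
look like this. A sketch, untested; the names are borrowed from the mytable
thread, and the extra derived table works around MySQL's refusal to select
from the same table a DELETE is modifying:

DELETE FROM mytable
WHERE id NOT IN (
    SELECT keep_id FROM (
        SELECT MIN(id) AS keep_id
        FROM mytable
        GROUP BY row1, row2
    ) AS keepers
);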
On Thu, 30 Mar 2006 21:11:56 -0500, Rich <[EMAIL PROTECTED]> wrote:

Hi there. Any quick way of killing duplicate records?

Cheers
Scott Haneda wrote, replying to Joerg Bruehe (5/5/05):

Basically, I have a shopping cart. This one is a little weird, for reasons
not worth explaining: you have a cart when a user is not logged in, and
they *may* have one they made at s[...]
on 5/5/05 2:11 AM, Joerg Bruehe at [EMAIL PROTECTED] wrote:

Hi Scott, all!

Scott Haneda wrote:
> I have been researching on how to deal with duplicate data. [...]
> I am not in a situation where I can test for duplicates on insert, so the
> ON DUPLICATE KEY UPDATE deal does me no good here.
> Most data points to selecting distinct records to a temp table, deleti[...]
From: "Scott Haneda" <[EMAIL PROTECTED]>
Sent: Thursday, May 05, 2005 3:39 AM
Subject: The age old delete duplicates

I have been researching on how to deal with duplicate data. While I have a
case where there can be duplicate data, and I want to get rid of it, the
general ideas I seem to find are as follows, plus my own, which I would
like opinions on since I have not seen it mentioned.
I am not in a situation where I can test for duplicates on insert, so the
ON DUPLICATE KEY UPDATE deal does me no good here.
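For reference, the insert-time approach being ruled out there is MySQL's
INSERT ... ON DUPLICATE KEY UPDATE. A sketch with made-up names; it assumes
a UNIQUE key on (cart_id, item_id):

INSERT INTO cart_items (cart_id, item_id, qty)
VALUES (42, 7, 1)
ON DUPLICATE KEY UPDATE qty = qty + VALUES(qty);

Instead of inserting a duplicate row, the existing row is updated, so the
table never accumulates duplicates in the first place.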